41445 1727204180.90290: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-bGV
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
41445 1727204180.90567: Added group all to inventory
41445 1727204180.90569: Added group ungrouped to inventory
41445 1727204180.90571: Group all now contains ungrouped
41445 1727204180.90576: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml
41445 1727204181.00418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
41445 1727204181.00460: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
41445 1727204181.00477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
41445 1727204181.00519: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
41445 1727204181.00566: Loaded config def from plugin (inventory/script)
41445 1727204181.00567: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
41445 1727204181.00595: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
41445 1727204181.00653: Loaded config def from plugin (inventory/yaml)
41445 1727204181.00655: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
41445 1727204181.00715: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
41445 1727204181.00992: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
41445 1727204181.00994: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
41445 1727204181.00996: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
41445 1727204181.01001: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
41445 1727204181.01004: Loading data from /tmp/network-zt6/inventory-rSl.yml
41445 1727204181.01044: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto
41445 1727204181.01092: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
41445 1727204181.01122: Loading data from /tmp/network-zt6/inventory-rSl.yml
41445 1727204181.01176: group all already in inventory
41445 1727204181.01182: set inventory_file for managed-node1
41445 1727204181.01185: set inventory_dir for managed-node1
41445 1727204181.01185: Added host managed-node1 to inventory
41445 1727204181.01187: Added host managed-node1 to group all
41445 1727204181.01187: set ansible_host for managed-node1
41445 1727204181.01188: set ansible_ssh_extra_args for managed-node1
41445 1727204181.01190: set inventory_file for managed-node2
41445 1727204181.01192: set inventory_dir for managed-node2
41445 1727204181.01192: Added host managed-node2 to inventory
41445 1727204181.01193: Added host managed-node2 to group all
41445 1727204181.01193: set ansible_host for managed-node2
41445 1727204181.01194: set ansible_ssh_extra_args for managed-node2
41445 1727204181.01196: set inventory_file for managed-node3
41445 1727204181.01197: set inventory_dir for managed-node3
41445 1727204181.01197: Added host managed-node3 to inventory
41445 1727204181.01198: Added host managed-node3 to group all
41445 1727204181.01199: set ansible_host for managed-node3
41445 1727204181.01199: set ansible_ssh_extra_args for managed-node3
41445 1727204181.01201: Reconcile groups and hosts in inventory.
41445 1727204181.01203: Group ungrouped now contains managed-node1
41445 1727204181.01204: Group ungrouped now contains managed-node2
41445 1727204181.01205: Group ungrouped now contains managed-node3
41445 1727204181.01255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
41445 1727204181.01334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
41445 1727204181.01362: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
41445 1727204181.01384: Loaded config def from plugin (vars/host_group_vars)
41445 1727204181.01385: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
41445 1727204181.01390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
41445 1727204181.01396: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
41445 1727204181.01423: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
41445 1727204181.01652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204181.01724: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
41445 1727204181.01747: Loaded config def from plugin (connection/local)
41445 1727204181.01749: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
41445 1727204181.02126: Loaded config def from plugin (connection/paramiko_ssh)
41445 1727204181.02128: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
41445 1727204181.02690: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41445 1727204181.02716: Loaded config def from plugin (connection/psrp)
41445 1727204181.02718: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
41445 1727204181.03122: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41445 1727204181.03146: Loaded config def from plugin (connection/ssh)
41445 1727204181.03148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
41445 1727204181.04452: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41445 1727204181.04477: Loaded config def from plugin (connection/winrm)
41445 1727204181.04479: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
41445 1727204181.04500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
41445 1727204181.04543: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
41445 1727204181.04585: Loaded config def from plugin (shell/cmd)
41445 1727204181.04587: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
41445 1727204181.04604: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
41445 1727204181.04641: Loaded config def from plugin (shell/powershell)
41445 1727204181.04642: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
41445 1727204181.04683: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
41445 1727204181.04787: Loaded config def from plugin (shell/sh)
41445 1727204181.04788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
41445 1727204181.04810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
41445 1727204181.04885: Loaded config def from plugin (become/runas)
41445 1727204181.04886: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
41445 1727204181.04994: Loaded config def from plugin (become/su)
41445 1727204181.04996: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
41445 1727204181.05091: Loaded config def from plugin (become/sudo)
41445 1727204181.05093: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
41445 1727204181.05116: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml
41445 1727204181.05326: in VariableManager get_vars()
41445 1727204181.05341: done with get_vars()
41445 1727204181.05430: trying /usr/local/lib/python3.12/site-packages/ansible/modules
41445 1727204181.07693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
41445 1727204181.07764: in VariableManager get_vars()
41445 1727204181.07767: done with get_vars()
41445 1727204181.07769: variable 'playbook_dir' from source: magic vars
41445 1727204181.07769: variable 'ansible_playbook_python' from source: magic vars
41445 1727204181.07770: variable 'ansible_config_file' from source: magic vars
41445 1727204181.07770: variable 'groups' from source: magic vars
41445 1727204181.07771: variable 'omit' from source: magic vars
41445 1727204181.07771: variable 'ansible_version' from source: magic vars
41445 1727204181.07771: variable 'ansible_check_mode' from source: magic vars
41445 1727204181.07772: variable 'ansible_diff_mode' from source: magic vars
41445 1727204181.07772: variable 'ansible_forks' from source: magic vars
41445 1727204181.07773: variable 'ansible_inventory_sources' from source: magic vars
41445 1727204181.07773: variable 'ansible_skip_tags' from source: magic vars
41445 1727204181.07774: variable 'ansible_limit' from source: magic vars
41445 1727204181.07774: variable 'ansible_run_tags' from source: magic vars
41445 1727204181.07774: variable 'ansible_verbosity' from source: magic vars
41445 1727204181.07798: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml
41445 1727204181.08227: in VariableManager get_vars()
41445 1727204181.08237: done with get_vars()
41445 1727204181.08258: in VariableManager get_vars()
41445 1727204181.08267: done with get_vars()
41445 1727204181.08296: in VariableManager get_vars()
41445 1727204181.08305: done with get_vars()
41445 1727204181.08334: in VariableManager get_vars()
41445 1727204181.08342: done with get_vars()
41445 1727204181.08345: variable 'omit' from source: magic vars
41445 1727204181.08356: variable 'omit' from source: magic vars
41445 1727204181.08382: in VariableManager get_vars()
41445 1727204181.08392: done with get_vars()
41445 1727204181.08428: in VariableManager get_vars()
41445 1727204181.08436: done with get_vars()
41445 1727204181.08459: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41445 1727204181.08589: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41445 1727204181.08677: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41445 1727204181.09251: in VariableManager get_vars()
41445 1727204181.09268: done with get_vars()
41445 1727204181.09654: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
41445 1727204181.09772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41445 1727204181.11756: in VariableManager get_vars()
41445 1727204181.11775: done with get_vars()
41445 1727204181.11782: variable 'omit' from source: magic vars
41445 1727204181.11792: variable 'omit' from source: magic vars
41445 1727204181.11821: in VariableManager get_vars()
41445 1727204181.11834: done with get_vars()
41445 1727204181.11854: in VariableManager get_vars()
41445 1727204181.11868: done with get_vars()
41445 1727204181.11896: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41445 1727204181.12004: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41445 1727204181.12080: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41445 1727204181.13561: in VariableManager get_vars()
41445 1727204181.13577: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41445 1727204181.15099: in VariableManager get_vars()
41445 1727204181.15104: done with get_vars()
41445 1727204181.15106: variable 'playbook_dir' from source: magic vars
41445 1727204181.15107: variable 'ansible_playbook_python' from source: magic vars
41445 1727204181.15108: variable 'ansible_config_file' from source: magic vars
41445 1727204181.15109: variable 'groups' from source: magic vars
41445 1727204181.15110: variable 'omit' from source: magic vars
41445 1727204181.15110: variable 'ansible_version' from source: magic vars
41445 1727204181.15111: variable 'ansible_check_mode' from source: magic vars
41445 1727204181.15112: variable 'ansible_diff_mode' from source: magic vars
41445 1727204181.15112: variable 'ansible_forks' from source: magic vars
41445 1727204181.15113: variable 'ansible_inventory_sources' from source: magic vars
41445 1727204181.15114: variable 'ansible_skip_tags' from source: magic vars
41445 1727204181.15114: variable 'ansible_limit' from source: magic vars
41445 1727204181.15115: variable 'ansible_run_tags' from source: magic vars
41445 1727204181.15116: variable 'ansible_verbosity' from source: magic vars
41445 1727204181.15147: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
41445 1727204181.15220: in VariableManager get_vars()
41445 1727204181.15223: done with get_vars()
41445 1727204181.15225: variable 'playbook_dir' from source: magic vars
41445 1727204181.15226: variable 'ansible_playbook_python' from source: magic vars
41445 1727204181.15227: variable 'ansible_config_file' from source: magic vars
41445 1727204181.15227: variable 'groups' from source: magic vars
41445 1727204181.15228: variable 'omit' from source: magic vars
41445 1727204181.15229: variable 'ansible_version' from source: magic vars
41445 1727204181.15229: variable 'ansible_check_mode' from source: magic vars
41445 1727204181.15230: variable 'ansible_diff_mode' from source: magic vars
41445 1727204181.15231: variable 'ansible_forks' from source: magic vars
41445 1727204181.15232: variable 'ansible_inventory_sources' from source: magic vars
41445 1727204181.15237: variable 'ansible_skip_tags' from source: magic vars
41445 1727204181.15238: variable 'ansible_limit' from source: magic vars
41445 1727204181.15238: variable 'ansible_run_tags' from source: magic vars
41445 1727204181.15239: variable 'ansible_verbosity' from source: magic vars
41445 1727204181.15268: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
41445 1727204181.15344: in VariableManager get_vars()
41445 1727204181.15356: done with get_vars()
41445 1727204181.15394: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41445 1727204181.15499: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41445 1727204181.15573: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41445 1727204181.15913: in VariableManager get_vars()
41445 1727204181.15927: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41445 1727204181.17013: in VariableManager get_vars()
41445 1727204181.17026: done with get_vars()
41445 1727204181.17060: in VariableManager get_vars()
41445 1727204181.17062: done with get_vars()
41445 1727204181.17064: variable 'playbook_dir' from source: magic vars
41445 1727204181.17065: variable 'ansible_playbook_python' from source: magic vars
41445 1727204181.17066: variable 'ansible_config_file' from source: magic vars
41445 1727204181.17067: variable 'groups' from source: magic vars
41445 1727204181.17068: variable 'omit' from source: magic vars
41445 1727204181.17068: variable 'ansible_version' from source: magic vars
41445 1727204181.17069: variable 'ansible_check_mode' from source: magic vars
41445 1727204181.17070: variable 'ansible_diff_mode' from source: magic vars
41445 1727204181.17070: variable 'ansible_forks' from source: magic vars
41445 1727204181.17071: variable 'ansible_inventory_sources' from source: magic vars
41445 1727204181.17072: variable 'ansible_skip_tags' from source: magic vars
41445 1727204181.17073: variable 'ansible_limit' from source: magic vars
41445 1727204181.17073: variable 'ansible_run_tags' from source: magic vars
41445 1727204181.17074: variable 'ansible_verbosity' from source: magic vars
41445 1727204181.17106: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
41445 1727204181.17171: in VariableManager get_vars()
41445 1727204181.17184: done with get_vars()
41445 1727204181.17222: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41445 1727204181.17341: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41445 1727204181.17414: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41445 1727204181.17778: in VariableManager get_vars()
41445 1727204181.17795: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41445 1727204181.19315: in VariableManager get_vars()
41445 1727204181.19328: done with get_vars()
41445 1727204181.19363: in VariableManager get_vars()
41445 1727204181.19375: done with get_vars()
41445 1727204181.19411: in VariableManager get_vars()
41445 1727204181.19422: done with get_vars()
41445 1727204181.19488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
41445 1727204181.19518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
41445 1727204181.19729: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
41445 1727204181.19822: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
41445 1727204181.19824: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
41445 1727204181.19844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
41445 1727204181.19860: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
41445 1727204181.19963: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
41445 1727204181.19999: Loaded config def from plugin (callback/default)
41445 1727204181.20001: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41445 1727204181.20962: Loaded config def from plugin (callback/junit)
41445 1727204181.20965: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41445 1727204181.21012: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
41445 1727204181.21074: Loaded config def from plugin (callback/minimal)
41445 1727204181.21079: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41445 1727204181.21119: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41445 1727204181.21180: Loaded config def from plugin (callback/tree)
41445 1727204181.21182: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
41445 1727204181.21295: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
41445 1727204181.21297: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_route_table_nm.yml *********************************************
6 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml
41445 1727204181.21328: in VariableManager get_vars()
41445 1727204181.21340: done with get_vars()
41445 1727204181.21346: in VariableManager get_vars()
41445 1727204181.21354: done with get_vars()
41445 1727204181.21357: variable 'omit' from source: magic vars
41445 1727204181.21393: in VariableManager get_vars()
41445 1727204181.21407: done with get_vars()
41445 1727204181.21427: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_route_table.yml' with nm as provider] ******
41445 1727204181.22057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
41445 1727204181.22140: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
41445 1727204181.22168: getting the remaining hosts for this loop
41445 1727204181.22170: done getting the remaining hosts for this loop
41445 1727204181.22173: getting the next task for host managed-node3
41445 1727204181.22180: done getting next task for host managed-node3
41445 1727204181.22181: ^ task is: TASK: Gathering Facts
41445 1727204181.22183: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204181.22185: getting variables
41445 1727204181.22186: in VariableManager get_vars()
41445 1727204181.22198: Calling all_inventory to load vars for managed-node3
41445 1727204181.22200: Calling groups_inventory to load vars for managed-node3
41445 1727204181.22203: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204181.22218: Calling all_plugins_play to load vars for managed-node3
41445 1727204181.22231: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204181.22235: Calling groups_plugins_play to load vars for managed-node3
41445 1727204181.22263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204181.22312: done with get_vars()
41445 1727204181.22317: done getting variables
41445 1727204181.22389: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.011) 0:00:00.011 *****
41445 1727204181.22407: entering _queue_task() for managed-node3/gather_facts
41445 1727204181.22411: Creating lock for gather_facts
41445 1727204181.22742: worker is 1 (out of 1 available)
41445 1727204181.22753: exiting _queue_task() for managed-node3/gather_facts
41445 1727204181.22765: done queuing things up, now waiting for results queue to drain
41445 1727204181.22767: waiting for pending results...
41445 1727204181.23194: running TaskExecutor() for managed-node3/TASK: Gathering Facts
41445 1727204181.23199: in run() - task 028d2410-947f-bf02-eee4-0000000000f5
41445 1727204181.23202: variable 'ansible_search_path' from source: unknown
41445 1727204181.23205: calling self._execute()
41445 1727204181.23224: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204181.23234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204181.23246: variable 'omit' from source: magic vars
41445 1727204181.23346: variable 'omit' from source: magic vars
41445 1727204181.23372: variable 'omit' from source: magic vars
41445 1727204181.23404: variable 'omit' from source: magic vars
41445 1727204181.23456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41445 1727204181.23500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41445 1727204181.23526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41445 1727204181.23556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41445 1727204181.23572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41445 1727204181.23611: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41445 1727204181.23619: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204181.23629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204181.23743: Set connection var ansible_shell_executable to /bin/sh
41445 1727204181.23757: Set connection var ansible_shell_type to sh
41445 1727204181.23771: Set connection var ansible_pipelining to False
41445 1727204181.23793: Set connection var ansible_timeout to 10
41445 1727204181.23801: Set connection var ansible_connection to ssh
41445 1727204181.23818: Set connection var ansible_module_compression to ZIP_DEFLATED
41445 1727204181.23848: variable 'ansible_shell_executable' from source: unknown
41445 1727204181.23863: variable 'ansible_connection' from source: unknown
41445 1727204181.23872: variable 'ansible_module_compression' from source: unknown
41445 1727204181.23973: variable 'ansible_shell_type' from source: unknown
41445 1727204181.23981: variable 'ansible_shell_executable' from source: unknown
41445 1727204181.23985: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204181.23987: variable 'ansible_pipelining' from source: unknown
41445 1727204181.23990: variable 'ansible_timeout' from source: unknown
41445 1727204181.23991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204181.24110: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41445 1727204181.24125: variable 'omit' from source: magic vars
41445 1727204181.24134: starting attempt loop
41445 1727204181.24139: running the handler
41445 1727204181.24158: variable 'ansible_facts' from source: unknown
41445 1727204181.24180: _low_level_execute_command(): starting
41445 1727204181.24194: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41445 1727204181.24969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41445 1727204181.25033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<<
41445 1727204181.25050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41445 1727204181.25077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41445 1727204181.25194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41445 1727204181.26874: stdout chunk (state=3): >>>/root <<<
41445 1727204181.26979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41445 1727204181.27014: stderr chunk (state=3): >>><<<
41445 1727204181.27047: stdout chunk (state=3): >>><<<
41445 1727204181.27081: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41445 1727204181.27101: _low_level_execute_command(): starting
41445 1727204181.27116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911 `" && echo ansible-tmp-1727204181.2708793-41482-79378652464911="` echo /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911 `" ) && sleep 0'
41445 1727204181.28182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41445 1727204181.28186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41445 1727204181.28188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41445 1727204181.28207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204181.28213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204181.28284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204181.30087: stdout chunk (state=3): >>>ansible-tmp-1727204181.2708793-41482-79378652464911=/root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911 <<< 41445 1727204181.30221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204181.30234: stdout chunk (state=3): >>><<< 41445 1727204181.30248: stderr chunk (state=3): >>><<< 41445 1727204181.30281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204181.2708793-41482-79378652464911=/root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204181.30320: variable 'ansible_module_compression' from source: unknown 41445 1727204181.30387: ANSIBALLZ: Using generic lock for ansible.legacy.setup 41445 1727204181.30397: ANSIBALLZ: Acquiring lock 41445 1727204181.30404: ANSIBALLZ: Lock acquired: 140182283768784 41445 1727204181.30411: ANSIBALLZ: Creating module 41445 1727204181.58129: ANSIBALLZ: Writing module into payload 41445 1727204181.58446: ANSIBALLZ: Writing module 41445 1727204181.58469: ANSIBALLZ: Renaming module 41445 1727204181.58567: ANSIBALLZ: Done creating module 41445 1727204181.58640: variable 'ansible_facts' from source: unknown 41445 1727204181.58643: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204181.58646: _low_level_execute_command(): starting 41445 1727204181.58648: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 41445 1727204181.59857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204181.59861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204181.59971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204181.59981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204181.59999: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204181.60068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204181.60072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204181.60074: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204181.60140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204181.60222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204181.62035: stdout chunk (state=3): >>>PLATFORM <<< 41445 1727204181.62039: stdout chunk (state=3): >>>Linux FOUND <<< 41445 1727204181.62048: stdout chunk (state=3): >>>/usr/bin/python3.12 /usr/bin/python3 <<< 41445 1727204181.62378: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 41445 1727204181.62382: stdout chunk (state=3): >>><<< 41445 1727204181.62384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204181.62386: stderr chunk (state=3): >>><<< 41445 1727204181.62389: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204181.62395 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 41445 1727204181.62398: _low_level_execute_command(): starting 41445 1727204181.62400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 41445 1727204181.62635: Sending initial data 41445 1727204181.62638: Sent initial data (1181 bytes) 41445 1727204181.63665: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204181.63797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204181.63820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204181.63834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204181.63847: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204181.63887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204181.63978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204181.64191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204181.64497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204181.67820: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 41445 1727204181.68151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204181.68191: stderr chunk (state=3): >>><<< 41445 1727204181.68195: stdout chunk (state=3): >>><<< 41445 1727204181.68213: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel 
fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204181.68408: variable 'ansible_facts' from source: unknown 41445 1727204181.68413: variable 'ansible_facts' from source: unknown 41445 1727204181.68692: variable 'ansible_module_compression' from source: unknown 41445 1727204181.68696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 
1727204181.68698: variable 'ansible_facts' from source: unknown 41445 1727204181.68727: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py 41445 1727204181.69486: Sending initial data 41445 1727204181.69489: Sent initial data (153 bytes) 41445 1727204181.69796: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204181.69939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204181.69956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204181.69963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204181.69978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204181.70000: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204181.70013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204181.70024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204181.70032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204181.70039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204181.70046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204181.70126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204181.70129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204181.70131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204181.70133: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204181.70211: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204181.70485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204181.70503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204181.70724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204181.72289: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204181.72314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204181.72474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmph13y3zru /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py <<< 41445 1727204181.72481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py" <<< 41445 1727204181.72484: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmph13y3zru" to remote "/root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py" <<< 41445 1727204181.75071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204181.75078: stdout chunk (state=3): >>><<< 41445 1727204181.75134: stderr chunk (state=3): >>><<< 41445 1727204181.75137: done transferring module to remote 41445 1727204181.75139: _low_level_execute_command(): starting 41445 1727204181.75141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/ /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py && sleep 0' 41445 1727204181.76175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204181.76372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204181.76502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204181.76606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204181.78372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204181.78379: stdout chunk (state=3): >>><<< 41445 1727204181.78394: stderr chunk (state=3): >>><<< 41445 1727204181.78403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204181.78406: _low_level_execute_command(): starting 41445 1727204181.78504: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/AnsiballZ_setup.py && sleep 0' 41445 1727204181.79477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204181.79489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204181.79498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204181.79514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204181.79527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204181.79864: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204181.79888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204181.79952: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204181.82043: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 41445 1727204181.82067: stdout chunk (state=3): >>>import _imp # builtin <<< 41445 1727204181.82095: stdout chunk (state=3): >>>import '_thread' # <<< 41445 1727204181.82106: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 41445 1727204181.82278: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 41445 1727204181.82282: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 41445 1727204181.82285: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 41445 1727204181.82329: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 41445 1727204181.82490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41445 1727204181.82522: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fbea50> import '_signal' # import '_abc' # import 'abc' # <<< 41445 1727204181.82538: stdout chunk (state=3): >>>import 'io' # <<< 41445 
1727204181.82582: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41445 1727204181.82652: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41445 1727204181.82918: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fce060> import 'site' # <<< 41445 1727204181.82945: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 41445 1727204181.83361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 41445 1727204181.83384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 41445 1727204181.83416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 41445 1727204181.83478: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dabe90> <<< 41445 1727204181.83539: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dabf50> <<< 41445 1727204181.83575: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 41445 1727204181.83724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 41445 1727204181.83740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778de3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778de3f20> <<< 41445 1727204181.83763: stdout chunk (state=3): >>>import '_collections' # <<< 41445 1727204181.83827: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dc3b60> <<< 41445 1727204181.83831: stdout chunk (state=3): >>>import '_functools' # <<< 41445 1727204181.83973: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dc1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778da9040> <<< 41445 1727204181.83984: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41445 1727204181.83987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 41445 1727204181.84070: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 41445 1727204181.84087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 41445 1727204181.84185: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5778e07800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e06420> <<< 41445 1727204181.84189: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dc2150> <<< 41445 1727204181.84247: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778daa900> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e38890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778da82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41445 1727204181.84306: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e38d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e38bf0> <<< 41445 1727204181.84398: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e38fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778da6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 41445 1727204181.84468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e39670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e39370> import 'importlib.machinery' # <<< 41445 1727204181.84478: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 41445 1727204181.84654: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e3a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 41445 1727204181.84661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 41445 1727204181.84664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e54740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed 
from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e55e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 41445 1727204181.84666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 41445 1727204181.84756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e56d20> <<< 41445 1727204181.84759: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e57350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e56270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 41445 1727204181.84825: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e57dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e57500> <<< 41445 1727204181.84874: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e3a4b0> <<< 41445 1727204181.84912: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 41445 1727204181.85107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 41445 1727204181.85113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b47d40> <<< 41445 1727204181.85116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41445 1727204181.85129: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b70830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b70590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b70770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41445 1727204181.85155: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.85284: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b71100> <<< 41445 1727204181.85400: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b71a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b709b0> <<< 41445 1727204181.85442: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b45ee0> <<< 41445 1727204181.85491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 41445 1727204181.85540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b72e40> <<< 41445 1727204181.85568: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b71b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e3ac60> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41445 1727204181.85752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 41445 1727204181.85799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b9f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 41445 1727204181.85820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 41445 1727204181.85852: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778bc3560> <<< 41445 1727204181.85871: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 41445 1727204181.85915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41445 1727204181.86039: stdout chunk (state=3): >>>import 'ntpath' # <<< 41445 1727204181.86102: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778c202f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/parse.py <<< 41445 1727204181.86107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 41445 1727204181.86130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41445 1727204181.86191: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778c22a50> <<< 41445 1727204181.86262: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778c20410> <<< 41445 1727204181.86340: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778be9310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778525400> <<< 41445 1727204181.86349: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778bc2360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b73da0> <<< 41445 1727204181.86604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41445 1727204181.86610: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5778bc2960> <<< 41445 1727204181.87015: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_6o6c1ev2/ansible_ansible.legacy.setup_payload.zip' <<< 41445 1727204181.87047: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.87190: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 41445 1727204181.87201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41445 1727204181.87270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 41445 1727204181.87546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577858b050> import '_typing' # <<< 41445 1727204181.87579: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778569f40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778569160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204181.87587: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.87607: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 41445 1727204181.87613: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.89020: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.90123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 41445 1727204181.90127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778589340> <<< 41445 1727204181.90153: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204181.90182: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 41445 1727204181.90205: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41445 1727204181.90239: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.90251: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57785be990> <<< 41445 1727204181.90275: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785be720> <<< 41445 1727204181.90308: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785be030> <<< 41445 1727204181.90337: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 41445 1727204181.90341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 41445 1727204181.90384: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785beb10> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577858bce0> import 'atexit' 
# <<< 41445 1727204181.90413: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57785bf680> <<< 41445 1727204181.90459: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.90462: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57785bf8c0> <<< 41445 1727204181.90491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 41445 1727204181.90506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41445 1727204181.90518: stdout chunk (state=3): >>>import '_locale' # <<< 41445 1727204181.90587: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785bfe00> import 'pwd' # <<< 41445 1727204181.90605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 41445 1727204181.90622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 41445 1727204181.90657: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778425af0> <<< 41445 1727204181.90692: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.90711: stdout chunk (state=3): >>># extension module 
'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57784277a0> <<< 41445 1727204181.90735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41445 1727204181.90998: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842c110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842d280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e56c90> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842df40> <<< 41445 1727204181.91015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41445 1727204181.91040: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 41445 1727204181.91062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 41445 1727204181.91081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 41445 1727204181.91090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 41445 1727204181.91200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 41445 1727204181.91220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 41445 1727204181.91240: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778433c80> import '_tokenize' # <<< 41445 1727204181.91307: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778432780> <<< 41445 1727204181.91338: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784324e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 41445 1727204181.91408: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778432a20> <<< 41445 1727204181.91436: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842e540> <<< 41445 1727204181.91462: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension 
module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778477f50> <<< 41445 1727204181.91492: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778477950> <<< 41445 1727204181.91519: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 41445 1727204181.91535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 41445 1727204181.91557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 41445 1727204181.91598: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778479af0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784798b0> <<< 41445 1727204181.91617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 41445 1727204181.91639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 41445 1727204181.91702: stdout chunk 
(state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577847bfb0> <<< 41445 1727204181.91713: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577847a0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41445 1727204181.92006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577847f6e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577847bf20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57784806e0> <<< 41445 1727204181.92039: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778480770> <<< 41445 1727204181.92080: stdout chunk (state=3): >>># extension module 
'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.92115: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778480980> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778478170> <<< 41445 1727204181.92177: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.92205: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778483fb0> <<< 41445 1727204181.92361: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577830d0a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778482780> <<< 41445 1727204181.92397: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.92472: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778483b00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784823c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 41445 1727204181.92581: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.92678: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204181.92693: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 41445 1727204181.92719: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 41445 1727204181.92798: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.92913: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.93414: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.93939: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 41445 1727204181.93964: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 41445 1727204181.93994: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204181.94037: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204181.94056: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57783111f0> <<< 41445 1727204181.94136: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778311fa0> <<< 41445 1727204181.94160: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577830d370> <<< 41445 1727204181.94214: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 41445 1727204181.94369: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 41445 1727204181.94397: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204181.94639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783120f0> # zipimport: zlib available <<< 41445 1727204181.95015: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95456: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95531: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95596: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 41445 1727204181.95614: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 
1727204181.95640: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95684: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 41445 1727204181.95752: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95853: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 41445 1727204181.95886: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95889: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 41445 1727204181.95912: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95923: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.95964: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41445 1727204181.95973: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96189: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41445 1727204181.96473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 41445 1727204181.96549: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778313260> <<< 41445 1727204181.96564: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96626: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96716: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 41445 1727204181.96736: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96783: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96827: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41445 1727204181.96831: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96866: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96914: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.96964: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.97033: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41445 1727204181.97071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204181.97153: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577831dd30> <<< 41445 1727204181.97222: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577831b170> <<< 41445 1727204181.97226: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 41445 1727204181.97242: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.97293: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.97354: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.97411: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.97439: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204181.97682: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 41445 1727204181.97685: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 41445 1727204181.97687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41445 1727204181.97689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784067e0> <<< 41445 1727204181.97694: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784fe4b0> <<< 41445 1727204181.97786: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577831df10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778315490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 41445 1727204181.97819: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 41445 1727204181.98103: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204181.98120: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.98155: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.98603: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204181.98785: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.98973: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b1ac0> <<< 41445 1727204181.99005: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 41445 1727204181.99011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 41445 1727204181.99080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 41445 1727204181.99083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 41445 1727204181.99099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f6fc20> <<< 41445 1727204181.99201: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777f6ffb0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577839a5a0> <<< 41445 1727204181.99222: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b2630> <<< 41445 1727204181.99332: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b0170> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b0500> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 41445 1727204181.99603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777f87020> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f868d0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777f86a80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f85d00> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f87050> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 41445 1727204181.99798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777fddaf0> import 
'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f87b00> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b1250> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 41445 1727204181.99832: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204181.99886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 41445 1727204181.99904: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.00098: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.00106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 41445 1727204182.00207: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.00215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 41445 1727204182.00495: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.00528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 41445 1727204182.00546: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 41445 1727204182.01020: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01500: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.distribution' # <<< 41445 1727204182.01503: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01672: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.01697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 41445 1727204182.01933: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01937: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 41445 1727204182.01943: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01945: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01948: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 41445 1727204182.01950: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01952: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.01954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 41445 1727204182.01963: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02067: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 41445 1727204182.02148: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777fded50> <<< 41445 1727204182.02171: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 
41445 1727204182.02198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 41445 1727204182.02482: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777fde1e0> import 'ansible.module_utils.facts.system.local' # <<< 41445 1727204182.02485: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02487: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 41445 1727204182.02562: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 41445 1727204182.02648: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02708: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 41445 1727204182.02885: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.02888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 41445 1727204182.02921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 41445 1727204182.03045: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204182.03099: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577801dc40> <<< 41445 1727204182.03324: stdout chunk (state=3): >>>import 'ssl' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5778002ae0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.03348: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 41445 1727204182.03368: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03444: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03522: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03646: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03782: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 41445 1727204182.03826: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03867: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 41445 1727204182.03879: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03914: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.03971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 41445 1727204182.04101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778025910> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577801d430> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.04150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 41445 1727204182.04184: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.04458: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.04465: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 41445 1727204182.04472: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.04659: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.04669: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.04708: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.04746: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 41445 1727204182.04895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 41445 1727204182.04903: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.04948: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.05094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 41445 1727204182.05111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 41445 1727204182.05223: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.05417: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 41445 1727204182.05420: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.05495: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.06070: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.06489: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 41445 1727204182.06505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 41445 1727204182.06604: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.06814: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 41445 1727204182.06826: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.06914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 41445 1727204182.06955: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.07073: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.07324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.07351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 41445 1727204182.07391: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.07459: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.07556: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.07762: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 41445 1727204182.08161: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 41445 
1727204182.08168: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08300: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 41445 1727204182.08379: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08418: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 41445 1727204182.08432: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08482: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 41445 1727204182.08694: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.08814: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 41445 1727204182.09140: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09143: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 41445 1727204182.09314: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 41445 1727204182.09356: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09482: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 41445 1727204182.09624: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09670: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 41445 1727204182.09730: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09741: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.09895: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.09928: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 41445 1727204182.10017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 41445 1727204182.10195: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 41445 1727204182.10317: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 41445 1727204182.10516: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10723: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 41445 1727204182.10729: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10731: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10733: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.openbsd' # <<< 41445 1727204182.10849: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204182.10888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 41445 1727204182.10899: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.10987: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.11084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 41445 1727204182.11294: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204182.11864: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 41445 1727204182.11892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 41445 1727204182.11905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 41445 1727204182.11936: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204182.11948: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777dbaba0> <<< 41445 1727204182.11963: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777db9040> <<< 41445 1727204182.11999: stdout chunk (state=3): >>>import 
'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777db3d40> <<< 41445 1727204182.27454: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 41445 1727204182.27485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e018b0> <<< 41445 1727204182.27580: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 41445 1727204182.27584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 41445 1727204182.27586: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e03b30> <<< 41445 1727204182.27607: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 41445 1727204182.27639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204182.27674: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 41445 1727204182.27694: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e4cb60> <<< 41445 1727204182.27707: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5777e4c680> <<< 41445 1727204182.28082: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 41445 1727204182.48194: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "22", "epoch": "1727204182", "epoch_int": "1727204182", "date": "2024-09-24", "time": "14:56:22", "iso8601_micro": "2024-09-24T18:56:22.113120Z", "iso8601": "2024-09-24T18:56:22Z", "iso8601_basic": "20240924T145622113120", "iso8601_basic_short": "20240924T145622", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.6201171875, "5m": 0.52392578125, "15m": 0.30419921875}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, 
"active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": 
"on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_fi<<< 41445 1727204182.48228: stdout chunk (state=3): >>>lters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": 
"02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": 
[]}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 759, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789200384, "block_size": 4096, "block_total": 65519099, "block_available": 63913379, "block_used": 1605720, "inode_total": 131070960, "inode_available": 131027343, "inode_used": 43617, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204182.48804: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 41445 1727204182.48841: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing 
_warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 41445 1727204182.48891: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants <<< 41445 1727204182.48936: stdout chunk (state=3): >>># cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing 
importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess 
# cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat <<< 41445 1727204182.48981: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters <<< 41445 1727204182.48984: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro <<< 41445 1727204182.49062: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux <<< 41445 1727204182.49080: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps<<< 41445 1727204182.49123: stdout chunk (state=3): >>> # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly <<< 41445 1727204182.49157: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 41445 1727204182.49495: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41445 1727204182.49498: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 41445 1727204182.49549: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 41445 1727204182.49552: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 41445 1727204182.49598: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 41445 1727204182.49601: stdout chunk (state=3): >>># 
destroy ntpath <<< 41445 1727204182.49651: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 41445 1727204182.49655: stdout chunk (state=3): >>># destroy json.scanner # destroy _json <<< 41445 1727204182.49687: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale <<< 41445 1727204182.49712: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 41445 1727204182.49741: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 41445 1727204182.49753: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 41445 1727204182.49823: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 41445 1727204182.49833: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue <<< 41445 1727204182.49859: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 41445 1727204182.49905: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 41445 1727204182.49933: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 41445 1727204182.49962: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 41445 1727204182.50013: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 41445 1727204182.50016: stdout chunk (state=3): >>># destroy glob # 
destroy fnmatch # destroy ansible.module_utils.compat.typing <<< 41445 1727204182.50041: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 41445 1727204182.50082: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 41445 1727204182.50114: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 41445 1727204182.50137: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 41445 1727204182.50189: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # 
destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 41445 1727204182.50217: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections <<< 41445 1727204182.50231: stdout chunk (state=3): >>># destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 41445 1727204182.50273: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 41445 1727204182.50298: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41445 1727204182.50438: stdout chunk (state=3): >>># destroy sys.monitoring <<< 41445 1727204182.50453: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 41445 1727204182.50498: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 41445 1727204182.50526: stdout chunk (state=3): >>># destroy tokenize <<< 41445 1727204182.50539: stdout chunk (state=3): >>># destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 41445 1727204182.50570: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 41445 1727204182.50607: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41445 1727204182.50613: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 41445 1727204182.50727: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 41445 1727204182.50730: stdout chunk (state=3): >>># destroy time <<< 41445 1727204182.50769: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 41445 1727204182.50801: stdout chunk (state=3): >>># destroy itertools <<< 41445 1727204182.50823: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 41445 1727204182.51389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204182.51392: stdout chunk (state=3): >>><<< 41445 1727204182.51395: stderr chunk (state=3): >>><<< 41445 1727204182.51487: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778fce060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dabe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dabf50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778de3890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778de3f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dc3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dc1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778da9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e07800> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5778e06420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778dc2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778daa900> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e38890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778da82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e38d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e38bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e38fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778da6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e39670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e39370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e3a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e54740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e55e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5778e56d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e57350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e56270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e57dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e57500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e3a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b47d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b70830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b70590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b70770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b71100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778b71a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b709b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b45ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b72e40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b71b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778e3ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b9f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778bc3560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778c202f0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778c22a50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778c20410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778be9310> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778525400> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778bc2360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778b73da0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5778bc2960> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_6o6c1ev2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577858b050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778569f40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778569160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778589340> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57785be990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785be720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785be030> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785beb10> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577858bce0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57785bf680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57785bf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57785bfe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778425af0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57784277a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842c110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842d280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778e56c90> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842df40> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778433c80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778432780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784324e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778432a20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577842e540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778477f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778477950> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778479af0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784798b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577847bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577847a0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577847f6e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577847bf20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57784806e0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778480770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778480980> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778478170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778483fb0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577830d0a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778482780> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778483b00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784823c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f57783111f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778311fa0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577830d370> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783120f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778313260> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577831dd30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577831b170> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784067e0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57784fe4b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577831df10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778315490> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b1ac0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f6fc20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777f6ffb0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577839a5a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b2630> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b0170> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b0500> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777f87020> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5777f868d0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777f86a80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f85d00> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f87050> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777fddaf0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777f87b00> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f57783b1250> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777fded50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5777fde1e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f577801dc40> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5778002ae0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5778025910> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f577801d430> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5777dbaba0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777db9040> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777db3d40> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e018b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e03b30> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e4cb60> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5777e4c680> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "22", "epoch": 
"1727204182", "epoch_int": "1727204182", "date": "2024-09-24", "time": "14:56:22", "iso8601_micro": "2024-09-24T18:56:22.113120Z", "iso8601": "2024-09-24T18:56:22Z", "iso8601_basic": "20240924T145622113120", "iso8601_basic_short": "20240924T145622", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": 
"/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], 
"ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.6201171875, "5m": 0.52392578125, "15m": 0.30419921875}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off 
[fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": 
"loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 759, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789200384, "block_size": 4096, "block_total": 65519099, "block_available": 63913379, "block_used": 1605720, "inode_total": 131070960, "inode_available": 131027343, "inode_used": 43617, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # 
cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing 
posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # 
cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # 
cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] 
removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # 
destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # 
destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 41445 1727204182.53414: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204182.53418: _low_level_execute_command(): starting 41445 1727204182.53420: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204181.2708793-41482-79378652464911/ > /dev/null 2>&1 && sleep 0' 41445 1727204182.53759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204182.53781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204182.53803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204182.53826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204182.53851: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204182.53924: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204182.53972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204182.53992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204182.54030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204182.54094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204182.55970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204182.56005: stdout chunk (state=3): >>><<< 41445 1727204182.56021: stderr chunk (state=3): >>><<< 41445 1727204182.56041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204182.56053: handler run complete 41445 1727204182.56282: variable 'ansible_facts' from source: unknown 41445 1727204182.56355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.57061: variable 'ansible_facts' from source: unknown 41445 1727204182.57143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.57319: attempt loop complete, returning result 41445 1727204182.57333: _execute() done 41445 1727204182.57340: dumping result to json 41445 1727204182.57384: done dumping result, returning 41445 1727204182.57438: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [028d2410-947f-bf02-eee4-0000000000f5] 41445 1727204182.57536: sending task result for task 028d2410-947f-bf02-eee4-0000000000f5 41445 1727204182.58380: done sending task result for task 028d2410-947f-bf02-eee4-0000000000f5 ok: [managed-node3] 41445 1727204182.59038: no more pending results, returning what we have 41445 1727204182.59042: results queue empty 41445 1727204182.59043: checking for any_errors_fatal 41445 1727204182.59044: done checking for any_errors_fatal 41445 1727204182.59045: checking for max_fail_percentage 41445 1727204182.59046: done checking for max_fail_percentage 41445 
1727204182.59047: checking to see if all hosts have failed and the running result is not ok 41445 1727204182.59048: done checking to see if all hosts have failed 41445 1727204182.59049: getting the remaining hosts for this loop 41445 1727204182.59050: done getting the remaining hosts for this loop 41445 1727204182.59082: getting the next task for host managed-node3 41445 1727204182.59088: done getting next task for host managed-node3 41445 1727204182.59090: ^ task is: TASK: meta (flush_handlers) 41445 1727204182.59092: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204182.59096: getting variables 41445 1727204182.59097: in VariableManager get_vars() 41445 1727204182.59169: Calling all_inventory to load vars for managed-node3 41445 1727204182.59172: Calling groups_inventory to load vars for managed-node3 41445 1727204182.59178: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204182.59222: WORKER PROCESS EXITING 41445 1727204182.59232: Calling all_plugins_play to load vars for managed-node3 41445 1727204182.59235: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204182.59238: Calling groups_plugins_play to load vars for managed-node3 41445 1727204182.59668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.59938: done with get_vars() 41445 1727204182.59954: done getting variables 41445 1727204182.60070: in VariableManager get_vars() 41445 1727204182.60083: Calling all_inventory to load vars for managed-node3 41445 1727204182.60085: Calling groups_inventory to load vars for managed-node3 41445 1727204182.60088: Calling all_plugins_inventory to load vars for managed-node3 41445 
1727204182.60092: Calling all_plugins_play to load vars for managed-node3 41445 1727204182.60094: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204182.60097: Calling groups_plugins_play to load vars for managed-node3 41445 1727204182.60307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.60792: done with get_vars() 41445 1727204182.60815: done queuing things up, now waiting for results queue to drain 41445 1727204182.60818: results queue empty 41445 1727204182.60819: checking for any_errors_fatal 41445 1727204182.60821: done checking for any_errors_fatal 41445 1727204182.60826: checking for max_fail_percentage 41445 1727204182.60827: done checking for max_fail_percentage 41445 1727204182.60828: checking to see if all hosts have failed and the running result is not ok 41445 1727204182.60829: done checking to see if all hosts have failed 41445 1727204182.60829: getting the remaining hosts for this loop 41445 1727204182.60830: done getting the remaining hosts for this loop 41445 1727204182.60837: getting the next task for host managed-node3 41445 1727204182.60842: done getting next task for host managed-node3 41445 1727204182.60844: ^ task is: TASK: Include the task 'el_repo_setup.yml' 41445 1727204182.60846: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204182.60848: getting variables 41445 1727204182.60849: in VariableManager get_vars() 41445 1727204182.60857: Calling all_inventory to load vars for managed-node3 41445 1727204182.60859: Calling groups_inventory to load vars for managed-node3 41445 1727204182.60861: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204182.60866: Calling all_plugins_play to load vars for managed-node3 41445 1727204182.60868: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204182.60871: Calling groups_plugins_play to load vars for managed-node3 41445 1727204182.61035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.61286: done with get_vars() 41445 1727204182.61293: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:11 Tuesday 24 September 2024 14:56:22 -0400 (0:00:01.389) 0:00:01.401 ***** 41445 1727204182.61371: entering _queue_task() for managed-node3/include_tasks 41445 1727204182.61379: Creating lock for include_tasks 41445 1727204182.61783: worker is 1 (out of 1 available) 41445 1727204182.61796: exiting _queue_task() for managed-node3/include_tasks 41445 1727204182.61805: done queuing things up, now waiting for results queue to drain 41445 1727204182.61807: waiting for pending results... 
41445 1727204182.62098: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 41445 1727204182.62237: in run() - task 028d2410-947f-bf02-eee4-000000000006 41445 1727204182.62300: variable 'ansible_search_path' from source: unknown 41445 1727204182.62412: calling self._execute() 41445 1727204182.62472: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204182.62489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204182.62504: variable 'omit' from source: magic vars 41445 1727204182.62694: _execute() done 41445 1727204182.62703: dumping result to json 41445 1727204182.62856: done dumping result, returning 41445 1727204182.62862: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-bf02-eee4-000000000006] 41445 1727204182.62865: sending task result for task 028d2410-947f-bf02-eee4-000000000006 41445 1727204182.62935: done sending task result for task 028d2410-947f-bf02-eee4-000000000006 41445 1727204182.62938: WORKER PROCESS EXITING 41445 1727204182.63020: no more pending results, returning what we have 41445 1727204182.63028: in VariableManager get_vars() 41445 1727204182.63060: Calling all_inventory to load vars for managed-node3 41445 1727204182.63064: Calling groups_inventory to load vars for managed-node3 41445 1727204182.63245: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204182.63255: Calling all_plugins_play to load vars for managed-node3 41445 1727204182.63258: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204182.63261: Calling groups_plugins_play to load vars for managed-node3 41445 1727204182.63545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.64453: done with get_vars() 41445 1727204182.64461: variable 'ansible_search_path' from source: unknown 41445 1727204182.64474: we have 
included files to process 41445 1727204182.64478: generating all_blocks data 41445 1727204182.64479: done generating all_blocks data 41445 1727204182.64480: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41445 1727204182.64481: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41445 1727204182.64484: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41445 1727204182.66087: in VariableManager get_vars() 41445 1727204182.66108: done with get_vars() 41445 1727204182.66123: done processing included file 41445 1727204182.66125: iterating over new_blocks loaded from include file 41445 1727204182.66127: in VariableManager get_vars() 41445 1727204182.66135: done with get_vars() 41445 1727204182.66137: filtering new block on tags 41445 1727204182.66197: done filtering new block on tags 41445 1727204182.66201: in VariableManager get_vars() 41445 1727204182.66259: done with get_vars() 41445 1727204182.66261: filtering new block on tags 41445 1727204182.66327: done filtering new block on tags 41445 1727204182.66330: in VariableManager get_vars() 41445 1727204182.66342: done with get_vars() 41445 1727204182.66343: filtering new block on tags 41445 1727204182.66356: done filtering new block on tags 41445 1727204182.66358: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 41445 1727204182.66438: extending task lists for all hosts with included blocks 41445 1727204182.66495: done extending task lists 41445 1727204182.66496: done processing included files 41445 1727204182.66497: results queue empty 41445 1727204182.66498: checking for any_errors_fatal 41445 1727204182.66499: done checking for any_errors_fatal 41445 
1727204182.66500: checking for max_fail_percentage 41445 1727204182.66501: done checking for max_fail_percentage 41445 1727204182.66502: checking to see if all hosts have failed and the running result is not ok 41445 1727204182.66502: done checking to see if all hosts have failed 41445 1727204182.66503: getting the remaining hosts for this loop 41445 1727204182.66504: done getting the remaining hosts for this loop 41445 1727204182.66506: getting the next task for host managed-node3 41445 1727204182.66510: done getting next task for host managed-node3 41445 1727204182.66512: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 41445 1727204182.66514: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204182.66516: getting variables 41445 1727204182.66517: in VariableManager get_vars() 41445 1727204182.66524: Calling all_inventory to load vars for managed-node3 41445 1727204182.66526: Calling groups_inventory to load vars for managed-node3 41445 1727204182.66528: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204182.66533: Calling all_plugins_play to load vars for managed-node3 41445 1727204182.66536: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204182.66570: Calling groups_plugins_play to load vars for managed-node3 41445 1727204182.67005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204182.67229: done with get_vars() 41445 1727204182.67237: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.059) 0:00:01.460 ***** 41445 1727204182.67317: entering _queue_task() for managed-node3/setup 41445 1727204182.67747: worker is 1 (out of 1 available) 41445 1727204182.67758: exiting _queue_task() for managed-node3/setup 41445 1727204182.67768: done queuing things up, now waiting for results queue to drain 41445 1727204182.67769: waiting for pending results... 
41445 1727204182.68271: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 41445 1727204182.68312: in run() - task 028d2410-947f-bf02-eee4-000000000106 41445 1727204182.68387: variable 'ansible_search_path' from source: unknown 41445 1727204182.68423: variable 'ansible_search_path' from source: unknown 41445 1727204182.68785: calling self._execute() 41445 1727204182.68789: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204182.68791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204182.68794: variable 'omit' from source: magic vars 41445 1727204182.70969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204182.76608: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204182.76681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204182.77075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204182.77081: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204182.77084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204182.77443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204182.77491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204182.77633: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204182.77981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204182.77985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204182.78121: variable 'ansible_facts' from source: unknown 41445 1727204182.78310: variable 'network_test_required_facts' from source: task vars 41445 1727204182.78353: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 41445 1727204182.78393: variable 'omit' from source: magic vars 41445 1727204182.78464: variable 'omit' from source: magic vars 41445 1727204182.78652: variable 'omit' from source: magic vars 41445 1727204182.78655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204182.78744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204182.78860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204182.78864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204182.78867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204182.78938: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204182.78948: variable 'ansible_host' from source: host vars for 
'managed-node3' 41445 1727204182.78957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204182.79183: Set connection var ansible_shell_executable to /bin/sh 41445 1727204182.79186: Set connection var ansible_shell_type to sh 41445 1727204182.79188: Set connection var ansible_pipelining to False 41445 1727204182.79291: Set connection var ansible_timeout to 10 41445 1727204182.79294: Set connection var ansible_connection to ssh 41445 1727204182.79297: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204182.79326: variable 'ansible_shell_executable' from source: unknown 41445 1727204182.79398: variable 'ansible_connection' from source: unknown 41445 1727204182.79402: variable 'ansible_module_compression' from source: unknown 41445 1727204182.79404: variable 'ansible_shell_type' from source: unknown 41445 1727204182.79406: variable 'ansible_shell_executable' from source: unknown 41445 1727204182.79408: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204182.79410: variable 'ansible_pipelining' from source: unknown 41445 1727204182.79415: variable 'ansible_timeout' from source: unknown 41445 1727204182.79424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204182.79883: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204182.79887: variable 'omit' from source: magic vars 41445 1727204182.79890: starting attempt loop 41445 1727204182.79892: running the handler 41445 1727204182.79894: _low_level_execute_command(): starting 41445 1727204182.79896: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204182.81448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204182.81585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204182.81681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204182.81697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204182.81893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204182.83482: stdout chunk (state=3): >>>/root <<< 41445 1727204182.83621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204182.83631: stdout chunk (state=3): >>><<< 41445 1727204182.83643: stderr chunk (state=3): >>><<< 41445 1727204182.83731: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204182.83927: _low_level_execute_command(): starting 41445 1727204182.83930: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128 `" && echo ansible-tmp-1727204182.8367107-41706-227393699105128="` echo /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128 `" ) && sleep 0' 41445 1727204182.85246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204182.85260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204182.85280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204182.85296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204182.85313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204182.85323: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204182.85341: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204182.85454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204182.85514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204182.85692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204182.85724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204182.87642: stdout chunk (state=3): >>>ansible-tmp-1727204182.8367107-41706-227393699105128=/root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128 <<< 41445 1727204182.87822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204182.87825: stdout chunk (state=3): >>><<< 41445 1727204182.87829: stderr chunk (state=3): >>><<< 41445 1727204182.88058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204182.8367107-41706-227393699105128=/root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204182.88062: variable 'ansible_module_compression' from source: unknown 41445 1727204182.88080: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 1727204182.88233: variable 'ansible_facts' from source: unknown 41445 1727204182.88689: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py 41445 1727204182.88832: Sending initial data 41445 1727204182.88836: Sent initial data (154 bytes) 41445 1727204182.89491: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204182.89568: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204182.89588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204182.89599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204182.89662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204182.91594: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204182.91688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204182.91760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpyyeurjw9 /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py <<< 41445 1727204182.91763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py" <<< 41445 1727204182.91767: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpyyeurjw9" to remote "/root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py" <<< 41445 1727204182.92955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204182.93006: stderr chunk (state=3): >>><<< 41445 1727204182.93060: stdout chunk (state=3): >>><<< 41445 1727204182.93063: done transferring module to remote 41445 1727204182.93065: _low_level_execute_command(): starting 41445 1727204182.93067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/ /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py && sleep 0' 41445 1727204182.93494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204182.93499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204182.93503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 41445 1727204182.93513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204182.93516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204182.93557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204182.93561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204182.93608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204182.95599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204182.95694: stderr chunk (state=3): >>><<< 41445 1727204182.95698: stdout chunk (state=3): >>><<< 41445 1727204182.95700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41445 1727204182.95703: _low_level_execute_command(): starting 41445 1727204182.95705: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/AnsiballZ_setup.py && sleep 0' 41445 1727204182.96919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204182.97005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204182.97019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204182.97059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204182.97084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
41445 1727204182.97113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204182.97188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204182.99526: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 41445 1727204182.99572: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204182.99622: stdout chunk (state=3): >>>import '_codecs' # <<< 41445 1727204182.99625: stdout chunk (state=3): >>>import 'codecs' # <<< 41445 1727204182.99780: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 41445 1727204182.99784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 41445 1727204182.99787: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67a104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d679dfb30> <<< 41445 1727204182.99789: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41445 1727204182.99792: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67a12a50> <<< 41445 1727204182.99794: stdout 
chunk (state=3): >>>import '_signal' # <<< 41445 1727204182.99897: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 41445 1727204182.99916: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41445 1727204182.99944: stdout chunk (state=3): >>>import 'genericpath' # <<< 41445 1727204182.99953: stdout chunk (state=3): >>>import 'posixpath' # <<< 41445 1727204182.99973: stdout chunk (state=3): >>>import 'os' # <<< 41445 1727204182.99998: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 41445 1727204183.00029: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 41445 1727204183.00035: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 41445 1727204183.00046: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41445 1727204183.00201: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677c2060> import 'site' # <<< 41445 1727204183.00226: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more 
information. <<< 41445 1727204183.00598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 41445 1727204183.00614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 41445 1727204183.00678: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 41445 1727204183.00681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.00683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 41445 1727204183.00784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 41445 1727204183.00787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 41445 1727204183.00789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 41445 1727204183.00791: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677fff50> <<< 41445 1727204183.01019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678140e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67837980> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 41445 1727204183.01035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67837f50> <<< 41445 1727204183.01042: stdout chunk (state=3): >>>import '_collections' # <<< 41445 1727204183.01092: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67817c20> <<< 41445 1727204183.01110: stdout chunk (state=3): >>>import '_functools' # <<< 41445 1727204183.01130: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67815340> <<< 41445 1727204183.01220: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677fd100> <<< 41445 1727204183.01241: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41445 1727204183.01325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41445 1727204183.01328: stdout chunk (state=3): >>>import '_sre' # <<< 41445 1727204183.01396: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6785b950> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6785a570> <<< 41445 1727204183.01419: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 41445 1727204183.01432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67816210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67858d70> <<< 41445 1727204183.01484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 41445 1727204183.01503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67888950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677fc380> <<< 41445 1727204183.01526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41445 1727204183.01563: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d67888e00> <<< 41445 1727204183.01572: stdout chunk (state=3): >>>import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4d67888cb0> <<< 41445 1727204183.01602: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.01801: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678890a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677faea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67889760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67889460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6788a660> import 'importlib.util' # import 'runpy' # <<< 41445 1727204183.01980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 41445 1727204183.01983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 41445 1727204183.01986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # 
code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 41445 1727204183.01989: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a4860> <<< 41445 1727204183.01991: stdout chunk (state=3): >>>import 'errno' # <<< 41445 1727204183.01994: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.01996: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678a5fa0> <<< 41445 1727204183.01998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 41445 1727204183.02000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 41445 1727204183.02005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 41445 1727204183.02017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 41445 1727204183.02026: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a6e40> <<< 41445 1727204183.02199: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678a74a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a6390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches 
/usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678a7f20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a7650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6788a690> <<< 41445 1727204183.02217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 41445 1727204183.02241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 41445 1727204183.02260: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 41445 1727204183.02282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 41445 1727204183.02311: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d6759bda0> <<< 41445 1727204183.02340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 41445 1727204183.02352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41445 1727204183.02460: stdout chunk (state=3): >>># extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.02465: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c48f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c4650> <<< 41445 1727204183.02468: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c4920> <<< 41445 1727204183.02470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 41445 1727204183.02587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41445 1727204183.02598: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.02633: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c5250><<< 41445 1727204183.02642: stdout chunk (state=3): >>> <<< 41445 1727204183.02748: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.02763: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c5c40> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c4b00> <<< 41445 1727204183.02789: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67599f40> <<< 41445 1727204183.02799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 41445 1727204183.02899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c7050> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c5d90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6788ad80> <<< 41445 1727204183.02929: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41445 1727204183.02991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.03006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 41445 1727204183.03039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 41445 1727204183.03114: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675ef3e0> <<< 41445 1727204183.03197: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 41445 1727204183.03223: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676137a0> <<< 41445 1727204183.03237: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 41445 1727204183.03284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41445 1727204183.03346: stdout chunk (state=3): >>>import 'ntpath' # <<< 41445 1727204183.03442: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67670590> <<< 41445 1727204183.03495: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41445 1727204183.03561: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67672cf0> <<< 41445 1727204183.03633: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676706b0> <<< 41445 1727204183.03668: stdout chunk (state=3): >>>import 'pathlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4d676395b0> <<< 41445 1727204183.03695: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f25700> <<< 41445 1727204183.03794: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676125a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c7fb0> <<< 41445 1727204183.03889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41445 1727204183.03908: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4d67612900> <<< 41445 1727204183.04181: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_mejrwmi9/ansible_setup_payload.zip' # zipimport: zlib available <<< 41445 1727204183.04305: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.04328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 41445 1727204183.04394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 41445 1727204183.04539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 41445 1727204183.04543: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' 
import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f8b470> <<< 41445 1727204183.04545: stdout chunk (state=3): >>>import '_typing' # <<< 41445 1727204183.04884: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f6e360> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f6d4c0> <<< 41445 1727204183.04897: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.04906: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 41445 1727204183.06180: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.07317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 41445 1727204183.07337: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f89340> <<< 41445 1727204183.07350: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.07382: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 41445 1727204183.07385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 41445 1727204183.07443: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41445 
1727204183.07446: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.07780: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66fbad80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fbab10> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fba420> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fba870> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67a129c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.07784: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66fbbad0> <<< 41445 1727204183.07787: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.07789: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66fbbcb0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 41445 1727204183.07798: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41445 1727204183.07800: stdout chunk (state=3): >>>import '_locale' # <<< 41445 1727204183.07900: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fe41d0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e29fa0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e2bbc0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 41445 1727204183.07919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41445 1727204183.07950: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2c5c0> <<< 41445 1727204183.07961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 41445 1727204183.08032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 41445 1727204183.08038: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2d760> <<< 41445 1727204183.08041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 41445 1727204183.08208: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e341d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e34320> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2e480> <<< 41445 1727204183.08230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41445 1727204183.08361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 41445 1727204183.08397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 41445 1727204183.08425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e37fb0> <<< 41445 1727204183.08477: stdout chunk (state=3): >>>import '_tokenize' # <<< 41445 1727204183.08687: stdout chunk (state=3): >>>import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e36ab0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e36840> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 41445 1727204183.08690: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e36d50> <<< 41445 1727204183.08693: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2e990> <<< 41445 1727204183.08695: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.08697: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e7c140> <<< 41445 1727204183.08817: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e7de20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7dbe0> <<< 41445 1727204183.08821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 41445 1727204183.08904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 41445 1727204183.08908: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e80380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7e510> <<< 41445 1727204183.08928: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41445 1727204183.08964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.08985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 41445 1727204183.09195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e83b60> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e80530> <<< 41445 1727204183.09230: stdout chunk (state=3): >>># extension 
module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e84bc0> <<< 41445 1727204183.09259: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e84c50> <<< 41445 1727204183.09301: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e84e00> <<< 41445 1727204183.09316: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7c500> <<< 41445 1727204183.09336: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 41445 1727204183.09354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 41445 1727204183.09369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 41445 1727204183.09381: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 41445 1727204183.09405: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.09601: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d104d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d11970> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e86c60> <<< 41445 1727204183.09625: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e87860> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e86870> <<< 41445 1727204183.09639: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.09666: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 41445 1727204183.09671: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.09754: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.09910: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.09913: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.common' # <<< 41445 1727204183.09916: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.09918: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 41445 1727204183.10021: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.10283: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.10657: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.11189: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 41445 1727204183.11232: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 41445 1727204183.11302: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.11318: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d15b20> <<< 41445 1727204183.11384: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 41445 1727204183.11387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d16960> <<< 41445 1727204183.11507: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4d66d11dc0> <<< 41445 1727204183.11512: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 41445 1727204183.11713: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.11788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 41445 1727204183.11941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d16a20> # zipimport: zlib available <<< 41445 1727204183.12471: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.12953: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.12968: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 41445 1727204183.13033: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.13058: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41445 1727204183.13092: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.13250: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.13306: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 41445 1727204183.13312: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.13412: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.13446: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41445 1727204183.13600: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.13883: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 41445 1727204183.14211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41445 1727204183.14290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41445 1727204183.14294: stdout chunk (state=3): >>>import '_ast' # <<< 41445 1727204183.14447: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d17c80> # zipimport: zlib available <<< 41445 1727204183.14540: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.14626: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 41445 1727204183.14668: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.14728: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41445 1727204183.14732: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.14788: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.14880: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.14924: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.15062: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41445 1727204183.15080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.15292: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension 
module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d225d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d1de50> <<< 41445 1727204183.15308: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 41445 1727204183.15393: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.15571: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.15631: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 41445 1727204183.15735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41445 1727204183.15741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41445 1727204183.16077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e0adb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66feaab0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f4d66d22780> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d10410> # destroy ansible.module_utils.distro <<< 41445 1727204183.16292: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 41445 1727204183.16315: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.16485: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # <<< 41445 1727204183.16488: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.16581: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.16586: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.16766: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 41445 1727204183.16920: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.17084: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.17087: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.17095: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 41445 1727204183.17098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.17146: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 41445 1727204183.17160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 41445 1727204183.17180: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db2b70> <<< 41445 1727204183.17303: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 41445 1727204183.17531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669cc500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d669cc8f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66da0740> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4d66db36b0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db1220> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db0d70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 41445 1727204183.17579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 41445 1727204183.17613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d669cf7d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669cf080> <<< 41445 1727204183.17705: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d669cf260> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669ce4e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 41445 1727204183.17963: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669cf980> <<< 41445 1727204183.17966: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 41445 1727204183.18199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66a2e480> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a2c4a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db0f20> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 41445 1727204183.18451: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 41445 1727204183.18651: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 41445 1727204183.18655: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 41445 1727204183.18657: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.18710: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 41445 1727204183.18713: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.18774: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.18869: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.18950: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.19032: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 41445 1727204183.19982: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.20716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.20777: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 41445 1727204183.20808: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.20855: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 41445 1727204183.21001: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 41445 1727204183.21023: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.21066: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.21104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 41445 1727204183.21289: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.21305: stdout chunk (state=3): >>># zipimport: zlib 
available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 41445 1727204183.21427: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 41445 1727204183.21762: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a2f9b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 41445 1727204183.21765: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a2f290> <<< 41445 1727204183.21768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 41445 1727204183.21804: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.21895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 41445 1727204183.21906: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.22038: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.22168: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 41445 1727204183.22188: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.22364: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # <<< 41445 1727204183.22428: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.22516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 41445 1727204183.22596: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 41445 1727204183.22810: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66a6a780> <<< 41445 1727204183.23045: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a5a4b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 41445 1727204183.23188: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 41445 1727204183.23207: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.23400: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.23443: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.23826: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.23829: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 41445 1727204183.23850: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 41445 1727204183.23932: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.23954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 41445 1727204183.23967: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.24017: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.24069: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 41445 1727204183.24128: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.24154: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66a7e150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a7dd60> import 'ansible.module_utils.facts.system.user' # <<< 41445 1727204183.24234: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 41445 1727204183.24255: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.24294: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 41445 1727204183.24741: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.24764: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 41445 1727204183.24897: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.24991: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 41445 1727204183.25009: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.25167: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.25671: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.26132: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.26908: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 41445 1727204183.26996: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.27292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 41445 1727204183.27299: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.27445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 41445 1727204183.27498: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.27773: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.27932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 41445 1727204183.27954: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 41445 1727204183.27990: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.28080: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 41445 1727204183.28098: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.28228: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.28379: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.28680: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.28941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 41445 1727204183.28994: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 
41445 1727204183.29201: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.29317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 41445 1727204183.29513: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 41445 1727204183.29640: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.29686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 41445 1727204183.29994: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.30345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.30349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 41445 1727204183.30351: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.30537: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.30557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 41445 1727204183.30600: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41445 1727204183.30714: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.30732: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.sunos' # <<< 41445 1727204183.30751: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.30797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 41445 1727204183.30939: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.30943: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.31219: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 41445 1727204183.31223: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 41445 1727204183.31285: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31677: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31682: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 41445 1727204183.31684: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31725: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 41445 1727204183.31778: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.31864: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.32062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib 
available <<< 41445 1727204183.32187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 41445 1727204183.32228: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.32257: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.32413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 41445 1727204183.32426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 41445 1727204183.32522: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.33926: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 41445 1727204183.34011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.34024: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d6687bce0> <<< 41445 1727204183.34036: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d668783e0> <<< 41445 1727204183.34157: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66878c20> <<< 41445 1727204183.34615: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 
0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "23", "epoch": "1727204183", "epoch_int": "1727204183", "date": "2024-09-24", "time": "14:56:23", "iso8601_micro": "2024-09-24T18:56:23.326710Z", "iso8601": "2024-09-24T18:56:23Z", "iso8601_basic": "20240924T145623326710", "iso8601_basic_short": "20240924T145623", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro"<<< 41445 1727204183.34635: stdout chunk (state=3): >>>: true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204183.35651: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing 
encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] 
removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy 
string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 41445 1727204183.35756: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing 
_multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys 
# cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dr<<< 41445 1727204183.35792: stdout chunk (state=3): >>>agonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 41445 1727204183.35985: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41445 1727204183.36007: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 41445 1727204183.36035: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 41445 1727204183.36079: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 41445 1727204183.36115: stdout chunk (state=3): >>># destroy ipaddress <<< 41445 1727204183.36133: stdout chunk (state=3): >>># destroy ntpath <<< 41445 1727204183.36205: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 41445 1727204183.36243: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 41445 1727204183.36272: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 41445 1727204183.36330: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection <<< 41445 
1727204183.36370: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 41445 1727204183.36477: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 41445 1727204183.36486: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 41445 1727204183.36498: stdout chunk (state=3): >>># destroy termios # destroy errno <<< 41445 1727204183.36533: stdout chunk (state=3): >>># destroy json <<< 41445 1727204183.36612: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 41445 1727204183.36620: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 41445 1727204183.36656: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping 
platform <<< 41445 1727204183.36660: stdout chunk (state=3): >>># cleanup[3] wiping atexit <<< 41445 1727204183.36679: stdout chunk (state=3): >>># cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 41445 1727204183.36719: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 41445 1727204183.36724: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix <<< 41445 1727204183.36727: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 41445 1727204183.36764: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 41445 1727204183.36770: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 41445 1727204183.36897: stdout chunk (state=3): >>># destroy _stat <<< 41445 1727204183.36904: stdout chunk (state=3): >>># cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] 
wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41445 1727204183.37026: stdout chunk (state=3): >>># destroy sys.monitoring <<< 41445 1727204183.37033: stdout chunk (state=3): >>># destroy _socket <<< 41445 1727204183.37052: stdout chunk (state=3): >>># destroy _collections <<< 41445 1727204183.37084: stdout chunk (state=3): >>># destroy platform <<< 41445 1727204183.37095: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 41445 1727204183.37104: stdout chunk (state=3): >>># destroy tokenize <<< 41445 1727204183.37166: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize<<< 41445 1727204183.37238: stdout chunk (state=3): >>> <<< 41445 1727204183.37241: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 41445 1727204183.37279: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41445 1727204183.37304: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 41445 1727204183.37419: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 41445 1727204183.37424: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 
# destroy encodings.idna # destroy _codecs <<< 41445 1727204183.37486: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 41445 1727204183.37489: stdout chunk (state=3): >>># destroy _random <<< 41445 1727204183.37495: stdout chunk (state=3): >>># destroy _weakref <<< 41445 1727204183.37522: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 41445 1727204183.37562: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 41445 1727204183.37579: stdout chunk (state=3): >>># clear sys.audit hooks <<< 41445 1727204183.38011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204183.38047: stderr chunk (state=3): >>><<< 41445 1727204183.38049: stdout chunk (state=3): >>><<< 41445 1727204183.38154: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4d67a104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d679dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67a12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677c2060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677fff50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678140e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67837980> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4d67837f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67817c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67815340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677fd100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6785b950> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6785a570> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67816210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67858d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67888950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677fc380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d67888e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67888cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678890a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d677faea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67889760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67889460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6788a660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a4860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678a5fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a6e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678a74a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a6390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d678a7f20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d678a7650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6788a690> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d6759bda0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c48f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c4650> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c4920> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c5250> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d675c5c40> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c4b00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67599f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c7050> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c5d90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d6788ad80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4d675ef3e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676137a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67670590> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67672cf0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676706b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676395b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f25700> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d676125a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d675c7fb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4d67612900> # zipimport: found 103 names in '/tmp/ansible_setup_payload_mejrwmi9/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f8b470> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f6e360> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f6d4c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66f89340> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66fbad80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fbab10> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fba420> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fba870> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d67a129c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66fbbad0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66fbbcb0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66fe41d0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e29fa0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e2bbc0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2c5c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2d760> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e341d0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e34320> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2e480> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e37fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e36ab0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e36840> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e36d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e2e990> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e7c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e7de20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7dbe0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e80380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7e510> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e83b60> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e80530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e84bc0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e84c50> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e84e00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e7c500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d104d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d11970> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e86c60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66e87860> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e86870> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d15b20> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d16960> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d11dc0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d16a20> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d17c80> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66d225d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d1de50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66e0adb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66feaab0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d22780> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66d10410> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db2b70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669cc500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d669cc8f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66da0740> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f4d66db36b0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db1220> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db0d70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d669cf7d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669cf080> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d669cf260> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669ce4e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d669cf980> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66a2e480> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a2c4a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66db0f20> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a2f9b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a2f290> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66a6a780> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a5a4b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d66a7e150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66a7dd60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d6687bce0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d668783e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d66878c20> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "23", "epoch": "1727204183", "epoch_int": "1727204183", "date": "2024-09-24", "time": "14:56:23", "iso8601_micro": "2024-09-24T18:56:23.326710Z", "iso8601": "2024-09-24T18:56:23Z", "iso8601_basic": "20240924T145623326710", "iso8601_basic_short": "20240924T145623", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear 
sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg 
# cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] 
removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # 
cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] 
removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy 
ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse 
# destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] 
wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy 
ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] 
removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing 
# destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # 
cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] 
removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon 
# destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback 
# destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy 
_socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 41445 1727204183.38986: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204183.38989: _low_level_execute_command(): starting 41445 1727204183.38992: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204182.8367107-41706-227393699105128/ > /dev/null 2>&1 && sleep 0' 41445 1727204183.39248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204183.39252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204183.39255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204183.39257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204183.39268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204183.39280: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204183.39298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.39301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204183.39317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204183.39321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204183.39323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204183.39338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204183.39390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204183.39411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 <<< 41445 1727204183.39477: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.39514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204183.39536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204183.39559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.39648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204183.42132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204183.42161: stderr chunk (state=3): >>><<< 41445 1727204183.42165: stdout chunk (state=3): >>><<< 41445 1727204183.42190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204183.42383: handler run complete 41445 1727204183.42386: variable 'ansible_facts' from source: unknown 41445 1727204183.42389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204183.42443: variable 'ansible_facts' from source: unknown 41445 1727204183.42508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204183.42578: attempt loop complete, returning result 41445 1727204183.42586: _execute() done 41445 1727204183.42593: dumping result to json 41445 1727204183.42626: done dumping result, returning 41445 1727204183.42638: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-bf02-eee4-000000000106] 41445 1727204183.42646: sending task result for task 028d2410-947f-bf02-eee4-000000000106 ok: [managed-node3] 41445 1727204183.42947: no more pending results, returning what we have 41445 1727204183.42950: results queue empty 41445 1727204183.42951: checking for any_errors_fatal 41445 1727204183.42952: done checking for any_errors_fatal 41445 1727204183.42952: checking for max_fail_percentage 41445 1727204183.42954: done checking for max_fail_percentage 41445 1727204183.42954: checking to see if all hosts have failed and the running result is not ok 41445 1727204183.42955: done checking to see if all hosts have failed 41445 1727204183.42956: getting the remaining hosts for this loop 41445 1727204183.42957: done getting the remaining hosts for this loop 41445 1727204183.42960: getting the next task for host managed-node3 41445 1727204183.42968: done getting next task for host managed-node3 41445 1727204183.42970: ^ task is: TASK: Check if system is ostree 41445 1727204183.42973: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204183.42978: getting variables 41445 1727204183.42986: in VariableManager get_vars() 41445 1727204183.43016: Calling all_inventory to load vars for managed-node3 41445 1727204183.43019: Calling groups_inventory to load vars for managed-node3 41445 1727204183.43023: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204183.43034: Calling all_plugins_play to load vars for managed-node3 41445 1727204183.43037: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204183.43040: Calling groups_plugins_play to load vars for managed-node3 41445 1727204183.43064: done sending task result for task 028d2410-947f-bf02-eee4-000000000106 41445 1727204183.43068: WORKER PROCESS EXITING 41445 1727204183.43290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204183.43464: done with get_vars() 41445 1727204183.43475: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:56:23 -0400 (0:00:00.762) 0:00:02.223 ***** 41445 1727204183.43571: entering _queue_task() for managed-node3/stat 41445 1727204183.43850: worker is 1 (out of 1 available) 41445 1727204183.43862: exiting _queue_task() for managed-node3/stat 41445 1727204183.43873: done queuing things up, now waiting 
for results queue to drain 41445 1727204183.43874: waiting for pending results... 41445 1727204183.44144: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 41445 1727204183.44296: in run() - task 028d2410-947f-bf02-eee4-000000000108 41445 1727204183.44299: variable 'ansible_search_path' from source: unknown 41445 1727204183.44303: variable 'ansible_search_path' from source: unknown 41445 1727204183.44350: calling self._execute() 41445 1727204183.44417: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204183.44421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204183.44431: variable 'omit' from source: magic vars 41445 1727204183.44790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204183.44960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204183.44995: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204183.45022: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204183.45047: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204183.45117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204183.45132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204183.45150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 
1727204183.45167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204183.45257: Evaluated conditional (not __network_is_ostree is defined): True 41445 1727204183.45261: variable 'omit' from source: magic vars 41445 1727204183.45289: variable 'omit' from source: magic vars 41445 1727204183.45314: variable 'omit' from source: magic vars 41445 1727204183.45333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204183.45360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204183.45373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204183.45388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204183.45396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204183.45420: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204183.45423: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204183.45426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204183.45496: Set connection var ansible_shell_executable to /bin/sh 41445 1727204183.45500: Set connection var ansible_shell_type to sh 41445 1727204183.45503: Set connection var ansible_pipelining to False 41445 1727204183.45510: Set connection var ansible_timeout to 10 41445 1727204183.45515: Set connection var ansible_connection to ssh 41445 1727204183.45521: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204183.45556: variable 'ansible_shell_executable' from source: unknown 41445 
1727204183.45567: variable 'ansible_connection' from source: unknown 41445 1727204183.45570: variable 'ansible_module_compression' from source: unknown 41445 1727204183.45573: variable 'ansible_shell_type' from source: unknown 41445 1727204183.45577: variable 'ansible_shell_executable' from source: unknown 41445 1727204183.45580: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204183.45582: variable 'ansible_pipelining' from source: unknown 41445 1727204183.45584: variable 'ansible_timeout' from source: unknown 41445 1727204183.45588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204183.45689: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204183.45697: variable 'omit' from source: magic vars 41445 1727204183.45702: starting attempt loop 41445 1727204183.45705: running the handler 41445 1727204183.45717: _low_level_execute_command(): starting 41445 1727204183.45725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204183.46243: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204183.46248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204183.46251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.46307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204183.46385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.46395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204183.48664: stdout chunk (state=3): >>>/root <<< 41445 1727204183.48811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204183.48834: stderr chunk (state=3): >>><<< 41445 1727204183.48838: stdout chunk (state=3): >>><<< 41445 1727204183.48858: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204183.48872: _low_level_execute_command(): starting 41445 1727204183.48885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502 `" && echo ansible-tmp-1727204183.4885695-41816-8571812411502="` echo /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502 `" ) && sleep 0' 41445 1727204183.49335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204183.49338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.49340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204183.49343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204183.49345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.49393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 
1727204183.49396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.49443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204183.52107: stdout chunk (state=3): >>>ansible-tmp-1727204183.4885695-41816-8571812411502=/root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502 <<< 41445 1727204183.52387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204183.52390: stderr chunk (state=3): >>><<< 41445 1727204183.52393: stdout chunk (state=3): >>><<< 41445 1727204183.52395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204183.4885695-41816-8571812411502=/root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
41445 1727204183.52402: variable 'ansible_module_compression' from source: unknown 41445 1727204183.52457: ANSIBALLZ: Using lock for stat 41445 1727204183.52465: ANSIBALLZ: Acquiring lock 41445 1727204183.52473: ANSIBALLZ: Lock acquired: 140182283769408 41445 1727204183.52484: ANSIBALLZ: Creating module 41445 1727204183.63658: ANSIBALLZ: Writing module into payload 41445 1727204183.63757: ANSIBALLZ: Writing module 41445 1727204183.63785: ANSIBALLZ: Renaming module 41445 1727204183.63797: ANSIBALLZ: Done creating module 41445 1727204183.63823: variable 'ansible_facts' from source: unknown 41445 1727204183.63897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py 41445 1727204183.64129: Sending initial data 41445 1727204183.64138: Sent initial data (151 bytes) 41445 1727204183.64738: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204183.64750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204183.64846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 
1727204183.64861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204183.64884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.64960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204183.66455: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204183.66500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204183.66569: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpd2425wgr /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py <<< 41445 1727204183.66572: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py" <<< 41445 1727204183.66612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpd2425wgr" to remote "/root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py" <<< 41445 1727204183.67327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204183.67374: stderr chunk (state=3): >>><<< 41445 1727204183.67388: stdout chunk (state=3): >>><<< 41445 1727204183.67444: done transferring module to remote 41445 1727204183.67562: _low_level_execute_command(): starting 41445 1727204183.67569: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/ /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py && sleep 0' 41445 1727204183.68129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204183.68143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204183.68161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.68231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204183.69980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204183.70017: stdout chunk (state=3): >>><<< 41445 1727204183.70021: stderr chunk (state=3): >>><<< 41445 1727204183.70120: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204183.70123: _low_level_execute_command(): starting 41445 1727204183.70125: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/AnsiballZ_stat.py && sleep 0' 41445 1727204183.70636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204183.70646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204183.70658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204183.70673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204183.70688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204183.70718: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204183.70721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.70724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204183.70726: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204183.70821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204183.70825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204183.70828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204183.70830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204183.70832: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204183.70834: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204183.70836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.70838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204183.70855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204183.70881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.70934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204183.73033: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 41445 1727204183.73065: stdout chunk (state=3): >>>import _imp # builtin <<< 41445 1727204183.73068: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 41445 1727204183.73143: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 41445 1727204183.73173: stdout chunk (state=3): >>>import 'posix' # <<< 41445 1727204183.73204: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 41445 1727204183.73242: stdout chunk (state=3): >>>import 'time' # <<< 41445 1727204183.73257: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 41445 1727204183.73291: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.73310: stdout chunk (state=3): >>>import '_codecs' # <<< 41445 1727204183.73335: stdout chunk (state=3): >>>import 'codecs' # <<< 41445 1727204183.73377: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/aliases.py <<< 41445 1727204183.73405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfee84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfeb7b30> <<< 41445 1727204183.73442: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41445 1727204183.73458: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfeeaa50> <<< 41445 1727204183.73514: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 41445 1727204183.73524: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 41445 1727204183.73552: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41445 1727204183.73634: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41445 1727204183.73662: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 41445 1727204183.73724: stdout chunk (state=3): >>>import 'os' # <<< 41445 1727204183.73728: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 41445 1727204183.73759: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41445 1727204183.73801: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 41445 1727204183.73815: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfc99130> <<< 41445 1727204183.73886: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 41445 1727204183.73889: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.73919: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfc9a060> <<< 41445 1727204183.73922: stdout chunk (state=3): >>>import 'site' # <<< 41445 1727204183.73955: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 41445 1727204183.74196: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 41445 1727204183.74218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.74236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 41445 1727204183.74279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 41445 1727204183.74320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 41445 1727204183.74353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd7f50> <<< 41445 1727204183.74380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 41445 1727204183.74384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 41445 1727204183.74445: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcec0e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 41445 1727204183.74468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 41445 1727204183.74472: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 41445 1727204183.74536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.74561: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd0f980> <<< 41445 1727204183.74598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 41445 1727204183.74611: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd0ff50> import '_collections' # <<< 41445 1727204183.74668: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcefc20> <<< 41445 1727204183.74671: stdout chunk (state=3): >>>import '_functools' # <<< 41445 1727204183.74696: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfced340> <<< 41445 1727204183.74787: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd5100> <<< 41445 1727204183.74805: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41445 1727204183.74836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41445 1727204183.74891: stdout chunk (state=3): >>>import '_sre' # <<< 41445 1727204183.74900: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # 
code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 41445 1727204183.74920: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 41445 1727204183.74987: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd33950> <<< 41445 1727204183.74990: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd32570> <<< 41445 1727204183.75033: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcee210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd30d70> <<< 41445 1727204183.75065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 41445 1727204183.75072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd60950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd4380> <<< 41445 1727204183.75104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41445 1727204183.75117: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.75448: stdout chunk (state=3): >>># 
extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd60e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd60cb0> <<< 41445 1727204183.75453: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd610a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd2ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd61760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd61460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd62660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches 
/usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 41445 1727204183.75455: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7c860> <<< 41445 1727204183.75456: stdout chunk (state=3): >>>import 'errno' # <<< 41445 1727204183.75480: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.75492: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.75497: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7dfa0> <<< 41445 1727204183.75522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 41445 1727204183.75555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 41445 1727204183.75572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 41445 1727204183.75578: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7ee40> <<< 41445 1727204183.75621: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7f4a0> <<< 41445 1727204183.75636: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7e390> <<< 
41445 1727204183.75657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 41445 1727204183.75678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 41445 1727204183.75713: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.75719: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7ff20> <<< 41445 1727204183.75789: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7f650> <<< 41445 1727204183.75795: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd62690> <<< 41445 1727204183.75898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 41445 1727204183.75938: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.75943: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb13da0> <<< 41445 1727204183.75974: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object 
from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41445 1727204183.76041: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3c8f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3c650> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3c920> <<< 41445 1727204183.76087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 41445 1727204183.76093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41445 1727204183.76186: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.76370: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.76376: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3d250> <<< 41445 1727204183.76515: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.76520: stdout chunk (state=3): >>># extension module '_blake2' executed from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3dc40> <<< 41445 1727204183.76562: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3cb00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb11f40> <<< 41445 1727204183.76584: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 41445 1727204183.76704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 41445 1727204183.76711: stdout chunk (state=3): >>> <<< 41445 1727204183.76727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 41445 1727204183.76741: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3f050> <<< 41445 1727204183.76789: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3dd90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd62d80> <<< 41445 1727204183.76820: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41445 1727204183.76916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 41445 1727204183.77025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb633e0> <<< 41445 1727204183.77061: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 41445 1727204183.77089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.77107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 41445 1727204183.77235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb8b7a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 41445 1727204183.77269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41445 1727204183.77364: stdout chunk (state=3): >>>import 'ntpath' # <<< 41445 1727204183.77392: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 41445 1727204183.77400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbe8590> <<< 41445 1727204183.77422: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 41445 1727204183.77465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 41445 1727204183.77644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41445 1727204183.77684: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbeacf0> <<< 41445 1727204183.77791: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbe86b0> <<< 41445 1727204183.77828: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbb15b0> <<< 41445 1727204183.77880: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf525700> <<< 41445 1727204183.77926: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb8a5a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3ffb0> <<< 41445 1727204183.78036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41445 1727204183.78243: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f46bfb8a900> <<< 41445 1727204183.78460: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_dau9rhof/ansible_stat_payload.zip' # zipimport: zlib available <<< 41445 1727204183.78554: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41445 1727204183.78712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object 
from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf577440> import '_typing' # <<< 41445 1727204183.78767: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf55a330> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf559490> # zipimport: zlib available <<< 41445 1727204183.78836: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 41445 1727204183.78850: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.78900: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 41445 1727204183.80491: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.82136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 41445 1727204183.82143: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf575310> <<< 41445 1727204183.82170: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.82211: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 41445 1727204183.82215: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 41445 1727204183.82240: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41445 1727204183.82278: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf5a2c90> <<< 41445 1727204183.82328: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5a2a20> <<< 41445 1727204183.82362: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5a2390> <<< 41445 1727204183.82451: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5a27b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfeea9c0> import 'atexit' # <<< 41445 1727204183.82472: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.82478: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf5a39e0> <<< 41445 1727204183.82507: stdout chunk (state=3): >>># extension module 'fcntl' loaded from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf5a3c20> <<< 41445 1727204183.82587: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41445 1727204183.82605: stdout chunk (state=3): >>>import '_locale' # <<< 41445 1727204183.82660: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5c8140> <<< 41445 1727204183.82691: stdout chunk (state=3): >>>import 'pwd' # <<< 41445 1727204183.82712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 41445 1727204183.82725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 41445 1727204183.82767: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf40dee0> <<< 41445 1727204183.82809: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf40f620> <<< 41445 1727204183.82812: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 41445 1727204183.82831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41445 1727204183.82871: stdout chunk (state=3): >>>import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4144a0> <<< 41445 1727204183.82893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 41445 1727204183.82998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 41445 1727204183.83002: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf415640> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 41445 1727204183.83016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 41445 1727204183.83126: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41c0e0> <<< 41445 1727204183.83135: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf41c200> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4163c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41445 1727204183.83360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from 
'/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41ffb0> import '_tokenize' # <<< 41445 1727204183.83364: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41ea80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41e7e0> <<< 41445 1727204183.83366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 41445 1727204183.83432: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41ed50> <<< 41445 1727204183.83460: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4168a0> <<< 41445 1727204183.83490: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf464170> <<< 41445 1727204183.83554: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bf464350> <<< 41445 1727204183.83642: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 41445 1727204183.83658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf465df0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf465bb0> <<< 41445 1727204183.83661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 41445 1727204183.83780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 41445 1727204183.83818: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf468320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4664e0> <<< 41445 1727204183.83909: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.83954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf46ba70> <<< 41445 1727204183.84047: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf468440> <<< 41445 1727204183.84116: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf46cb00> <<< 41445 1727204183.84208: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf46cc80> <<< 41445 1727204183.84214: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf46cc20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4644d0> <<< 41445 1727204183.84234: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 41445 1727204183.84248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 41445 1727204183.84277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 41445 1727204183.84290: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.84616: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf4f84d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf4f9820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf46ec60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7e270> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf46e870> # zipimport: zlib available <<< 41445 1727204183.84736: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # 
zipimport: zlib available <<< 41445 1727204183.84757: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.84887: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.84897: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.84907: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 41445 1727204183.84927: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.84938: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 41445 1727204183.84959: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.85133: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.85320: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.86191: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.87108: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 41445 1727204183.87112: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 41445 1727204183.87129: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 41445 1727204183.87397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf4fd8e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4fe720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4f9a00> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 41445 1727204183.87443: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.87461: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 41445 1727204183.87695: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.87925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 41445 1727204183.87942: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4fe6c0> # zipimport: zlib available <<< 41445 1727204183.88679: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.89398: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.89628: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # <<< 41445 1727204183.89631: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.89672: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.89715: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41445 1727204183.89730: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.89821: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.89972: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 41445 1727204183.90003: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # 
zipimport: zlib available <<< 41445 1727204183.90056: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.90100: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41445 1727204183.90117: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.90468: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.90916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41445 1727204183.90933: stdout chunk (state=3): >>>import '_ast' # <<< 41445 1727204183.91023: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4ff9b0> <<< 41445 1727204183.91042: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.91139: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.91239: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 41445 1727204183.91250: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 41445 1727204183.91266: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 41445 1727204183.91285: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.91343: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.91390: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41445 1727204183.91465: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.91520: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.91636: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.91748: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41445 1727204183.91801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.91869: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 41445 1727204183.91874: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf30a360> <<< 41445 1727204183.91922: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf3051f0> <<< 41445 1727204183.91972: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 41445 1727204183.92091: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.92192: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.92243: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41445 1727204183.92281: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 41445 1727204183.92304: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 41445 1727204183.92339: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41445 1727204183.92398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 41445 1727204183.92443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 41445 1727204183.92519: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5debd0> <<< 41445 1727204183.92568: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5ee8d0> <<< 41445 1727204183.93016: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf30a510> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4ff350> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 41445 1727204183.93294: stdout chunk (state=3): >>># zipimport: zlib available <<< 41445 1727204183.93461: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 41445 1727204183.93491: stdout chunk (state=3): >>># destroy __main__ <<< 41445 1727204183.93926: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear 
sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 41445 1727204183.93958: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs <<< 41445 1727204183.94032: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] 
removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 <<< 41445 1727204183.94072: stdout chunk (state=3): >>># cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing 
_json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess <<< 41445 1727204183.94113: stdout chunk (state=3): >>># cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 41445 1727204183.94136: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # 
cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 41445 1727204183.94445: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 41445 1727204183.94500: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 41445 1727204183.94523: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 41445 1727204183.94543: stdout chunk (state=3): >>># destroy ntpath <<< 41445 1727204183.94586: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 41445 1727204183.94599: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 41445 1727204183.94646: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 41445 1727204183.94657: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux # destroy shutil <<< 41445 1727204183.94733: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json 
# destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string <<< 41445 1727204183.94759: stdout chunk (state=3): >>># cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 41445 1727204183.94807: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 41445 1727204183.94834: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 41445 1727204183.94873: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping 
stat # cleanup[3] wiping _stat <<< 41445 1727204183.94879: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41445 1727204183.95084: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 41445 1727204183.95283: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41445 1727204183.95356: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # 
destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 41445 1727204183.95433: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 41445 1727204183.95549: stdout chunk (state=3): >>># clear sys.audit hooks <<< 41445 1727204183.96006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204183.96052: stderr chunk (state=3): >>><<< 41445 1727204183.96055: stdout chunk (state=3): >>><<< 41445 1727204183.96152: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfee84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfeb7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bfeeaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfc99130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfc9a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd7f50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcec0e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd0f980> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd0ff50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcefc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfced340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd5100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd33950> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd32570> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcee210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd30d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd60950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd4380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd60e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd60cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd610a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfcd2ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd61760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd61460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd62660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7c860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7dfa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7ee40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7f4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7e390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7ff20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd7f650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd62690> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb13da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3c8f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3c650> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3c920> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3d250> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfb3dc40> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3cb00> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb11f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3f050> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3dd90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfd62d80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb633e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb8b7a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbe8590> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbeacf0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbe86b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfbb15b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f46bf525700> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb8a5a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfb3ffb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f46bfb8a900> # zipimport: found 30 names in '/tmp/ansible_stat_payload_dau9rhof/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf577440> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf55a330> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf559490> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf575310> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf5a2c90> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5a2a20> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5a2390> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5a27b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bfeea9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf5a39e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf5a3c20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5c8140> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf40dee0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf40f620> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4144a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf415640> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41c0e0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf41c200> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4163c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41ffb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41ea80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41e7e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf41ed50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4168a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf464170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf464350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf465df0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf465bb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf468320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4664e0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf46ba70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf468440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf46cb00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf46cc80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf46cc20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4644d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf4f84d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf4f9820> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf46ec60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bfd7e270> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf46e870> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf4fd8e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4fe720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4f9a00> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4fe6c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4ff9b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f46bf30a360> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf3051f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5debd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf5ee8d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf30a510> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f46bf4ff350> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. [WARNING]: Module invocation had junk after the JSON data: 41445 1727204183.97038: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None,
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204183.97041: _low_level_execute_command(): starting 41445 1727204183.97043: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204183.4885695-41816-8571812411502/ > /dev/null 2>&1 && sleep 0' 41445 1727204183.97188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204183.97191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204183.97243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204183.97246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204183.97295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204184.00136: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 41445 1727204184.00139: stdout chunk (state=3): >>><<< 41445 1727204184.00142: stderr chunk (state=3): >>><<< 41445 1727204184.00278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204184.00281: handler run complete 41445 1727204184.00284: attempt loop complete, returning result 41445 1727204184.00286: _execute() done 41445 1727204184.00288: dumping result to json 41445 1727204184.00290: done dumping result, returning 41445 1727204184.00292: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [028d2410-947f-bf02-eee4-000000000108] 41445 1727204184.00294: sending task result for task 028d2410-947f-bf02-eee4-000000000108 ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 41445 1727204184.00442: no more pending results, 
returning what we have 41445 1727204184.00446: results queue empty 41445 1727204184.00447: checking for any_errors_fatal 41445 1727204184.00453: done checking for any_errors_fatal 41445 1727204184.00454: checking for max_fail_percentage 41445 1727204184.00456: done checking for max_fail_percentage 41445 1727204184.00457: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.00458: done checking to see if all hosts have failed 41445 1727204184.00458: getting the remaining hosts for this loop 41445 1727204184.00460: done getting the remaining hosts for this loop 41445 1727204184.00464: getting the next task for host managed-node3 41445 1727204184.00471: done getting next task for host managed-node3 41445 1727204184.00474: ^ task is: TASK: Set flag to indicate system is ostree 41445 1727204184.00603: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.00608: getting variables 41445 1727204184.00610: in VariableManager get_vars() 41445 1727204184.00640: Calling all_inventory to load vars for managed-node3 41445 1727204184.00643: Calling groups_inventory to load vars for managed-node3 41445 1727204184.00646: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.00657: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.00660: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.00663: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.01171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.01528: done with get_vars() 41445 1727204184.01539: done getting variables 41445 1727204184.01571: done sending task result for task 028d2410-947f-bf02-eee4-000000000108 41445 1727204184.01581: WORKER PROCESS EXITING 41445 1727204184.01655: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.581) 0:00:02.804 ***** 41445 1727204184.01703: entering _queue_task() for managed-node3/set_fact 41445 1727204184.01705: Creating lock for set_fact 41445 1727204184.02036: worker is 1 (out of 1 available) 41445 1727204184.02049: exiting _queue_task() for managed-node3/set_fact 41445 1727204184.02058: done queuing things up, now waiting for results queue to drain 41445 1727204184.02060: waiting for pending results... 
41445 1727204184.02273: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 41445 1727204184.02387: in run() - task 028d2410-947f-bf02-eee4-000000000109 41445 1727204184.02406: variable 'ansible_search_path' from source: unknown 41445 1727204184.02412: variable 'ansible_search_path' from source: unknown 41445 1727204184.02459: calling self._execute() 41445 1727204184.02546: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.02568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.02584: variable 'omit' from source: magic vars 41445 1727204184.03223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204184.03468: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204184.03531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204184.03594: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204184.03656: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204184.03738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204184.03778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204184.03819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204184.03874: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204184.03991: Evaluated conditional (not __network_is_ostree is defined): True 41445 1727204184.04003: variable 'omit' from source: magic vars 41445 1727204184.04054: variable 'omit' from source: magic vars 41445 1727204184.04202: variable '__ostree_booted_stat' from source: set_fact 41445 1727204184.04282: variable 'omit' from source: magic vars 41445 1727204184.04288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204184.04336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204184.04366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204184.04419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204184.04422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204184.04437: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204184.04445: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.04460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.04787: Set connection var ansible_shell_executable to /bin/sh 41445 1727204184.04791: Set connection var ansible_shell_type to sh 41445 1727204184.04793: Set connection var ansible_pipelining to False 41445 1727204184.04795: Set connection var ansible_timeout to 10 41445 1727204184.04796: Set connection var ansible_connection to ssh 41445 1727204184.04798: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204184.04819: variable 'ansible_shell_executable' 
from source: unknown 41445 1727204184.04828: variable 'ansible_connection' from source: unknown 41445 1727204184.04835: variable 'ansible_module_compression' from source: unknown 41445 1727204184.05004: variable 'ansible_shell_type' from source: unknown 41445 1727204184.05006: variable 'ansible_shell_executable' from source: unknown 41445 1727204184.05008: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.05010: variable 'ansible_pipelining' from source: unknown 41445 1727204184.05012: variable 'ansible_timeout' from source: unknown 41445 1727204184.05013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.05116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204184.05131: variable 'omit' from source: magic vars 41445 1727204184.05138: starting attempt loop 41445 1727204184.05143: running the handler 41445 1727204184.05158: handler run complete 41445 1727204184.05196: attempt loop complete, returning result 41445 1727204184.05204: _execute() done 41445 1727204184.05216: dumping result to json 41445 1727204184.05228: done dumping result, returning 41445 1727204184.05237: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [028d2410-947f-bf02-eee4-000000000109] 41445 1727204184.05249: sending task result for task 028d2410-947f-bf02-eee4-000000000109 ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 41445 1727204184.05464: no more pending results, returning what we have 41445 1727204184.05467: results queue empty 41445 1727204184.05468: checking for any_errors_fatal 41445 1727204184.05474: done checking for any_errors_fatal 41445 
1727204184.05477: checking for max_fail_percentage 41445 1727204184.05479: done checking for max_fail_percentage 41445 1727204184.05480: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.05481: done checking to see if all hosts have failed 41445 1727204184.05482: getting the remaining hosts for this loop 41445 1727204184.05483: done getting the remaining hosts for this loop 41445 1727204184.05487: getting the next task for host managed-node3 41445 1727204184.05495: done getting next task for host managed-node3 41445 1727204184.05498: ^ task is: TASK: Fix CentOS6 Base repo 41445 1727204184.05500: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.05505: getting variables 41445 1727204184.05507: in VariableManager get_vars() 41445 1727204184.05536: Calling all_inventory to load vars for managed-node3 41445 1727204184.05539: Calling groups_inventory to load vars for managed-node3 41445 1727204184.05543: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.05555: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.05558: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.05565: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.05584: done sending task result for task 028d2410-947f-bf02-eee4-000000000109 41445 1727204184.05593: WORKER PROCESS EXITING 41445 1727204184.05999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.06214: done with get_vars() 41445 1727204184.06223: done getting variables 41445 1727204184.06342: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.046) 0:00:02.851 ***** 41445 1727204184.06367: entering _queue_task() for managed-node3/copy 41445 1727204184.06610: worker is 1 (out of 1 available) 41445 1727204184.06622: exiting _queue_task() for managed-node3/copy 41445 1727204184.06632: done queuing things up, now waiting for results queue to drain 41445 1727204184.06633: waiting for pending results... 
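The "Fix CentOS6 Base repo" task at `el_repo_setup.yml:26` loads the `copy` action plugin and is gated on two logged conditionals: `ansible_distribution == 'CentOS'` (True here) and `ansible_distribution_major_version == '6'` (False here, so the task is skipped). A hedged sketch of its likely shape — the destination path and content variable are hypothetical illustrations, not from the log:

```yaml
# Sketch: the `copy` action and both conditionals come from the log;
# the dest path and the content variable are assumptions for
# illustration only.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed destination
    content: "{{ __centos6_base_repo }}"       # hypothetical variable
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

Because the node under test reports a major version other than 6, the `when` evaluation short-circuits to False and the worker emits a `skipping:` result without ever running the copy.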
41445 1727204184.06870: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 41445 1727204184.06957: in run() - task 028d2410-947f-bf02-eee4-00000000010b 41445 1727204184.07218: variable 'ansible_search_path' from source: unknown 41445 1727204184.07221: variable 'ansible_search_path' from source: unknown 41445 1727204184.07252: calling self._execute() 41445 1727204184.07434: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.07437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.07439: variable 'omit' from source: magic vars 41445 1727204184.07829: variable 'ansible_distribution' from source: facts 41445 1727204184.07852: Evaluated conditional (ansible_distribution == 'CentOS'): True 41445 1727204184.07979: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.07992: Evaluated conditional (ansible_distribution_major_version == '6'): False 41445 1727204184.07998: when evaluation is False, skipping this task 41445 1727204184.08004: _execute() done 41445 1727204184.08012: dumping result to json 41445 1727204184.08019: done dumping result, returning 41445 1727204184.08029: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [028d2410-947f-bf02-eee4-00000000010b] 41445 1727204184.08038: sending task result for task 028d2410-947f-bf02-eee4-00000000010b 41445 1727204184.08283: done sending task result for task 028d2410-947f-bf02-eee4-00000000010b 41445 1727204184.08286: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 41445 1727204184.08349: no more pending results, returning what we have 41445 1727204184.08353: results queue empty 41445 1727204184.08354: checking for any_errors_fatal 41445 1727204184.08357: done checking for any_errors_fatal 41445 1727204184.08358: checking for 
max_fail_percentage 41445 1727204184.08360: done checking for max_fail_percentage 41445 1727204184.08361: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.08361: done checking to see if all hosts have failed 41445 1727204184.08362: getting the remaining hosts for this loop 41445 1727204184.08363: done getting the remaining hosts for this loop 41445 1727204184.08366: getting the next task for host managed-node3 41445 1727204184.08371: done getting next task for host managed-node3 41445 1727204184.08373: ^ task is: TASK: Include the task 'enable_epel.yml' 41445 1727204184.08378: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.08382: getting variables 41445 1727204184.08383: in VariableManager get_vars() 41445 1727204184.08406: Calling all_inventory to load vars for managed-node3 41445 1727204184.08412: Calling groups_inventory to load vars for managed-node3 41445 1727204184.08415: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.08425: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.08428: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.08430: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.08691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.08906: done with get_vars() 41445 1727204184.08918: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.026) 0:00:02.877 ***** 41445 1727204184.09001: entering _queue_task() for managed-node3/include_tasks 41445 1727204184.09228: worker is 1 (out of 1 available) 41445 1727204184.09240: exiting _queue_task() for managed-node3/include_tasks 41445 1727204184.09250: done queuing things up, now waiting for results queue to drain 41445 1727204184.09251: waiting for pending results... 
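The include task at `el_repo_setup.yml:51` is guarded by the logged conditional `not __network_is_ostree | d(false)`, which is True on this run because the earlier `set_fact` recorded `__network_is_ostree: false`. The log then shows the included file being loaded from `tasks/enable_epel.yml` in the same test tree. A sketch, assuming the include uses a relative path:

```yaml
# Sketch: the conditional and the included file name come from the
# log; the relative path form is an assumption.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Note the `| d(false)` (default filter) guard: it lets the conditional evaluate safely even when `__network_is_ostree` was never set, rather than raising an undefined-variable error.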
41445 1727204184.09481: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 41445 1727204184.09589: in run() - task 028d2410-947f-bf02-eee4-00000000010c 41445 1727204184.09614: variable 'ansible_search_path' from source: unknown 41445 1727204184.09622: variable 'ansible_search_path' from source: unknown 41445 1727204184.09660: calling self._execute() 41445 1727204184.09744: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.09882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.09885: variable 'omit' from source: magic vars 41445 1727204184.10306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204184.12637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204184.12700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204184.12743: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204184.12794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204184.12834: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204184.12919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204184.12958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204184.12990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204184.13383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204184.13386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204184.13491: variable '__network_is_ostree' from source: set_fact 41445 1727204184.13494: Evaluated conditional (not __network_is_ostree | d(false)): True 41445 1727204184.13497: _execute() done 41445 1727204184.13499: dumping result to json 41445 1727204184.13502: done dumping result, returning 41445 1727204184.13504: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [028d2410-947f-bf02-eee4-00000000010c] 41445 1727204184.13506: sending task result for task 028d2410-947f-bf02-eee4-00000000010c 41445 1727204184.13623: no more pending results, returning what we have 41445 1727204184.13628: in VariableManager get_vars() 41445 1727204184.13666: Calling all_inventory to load vars for managed-node3 41445 1727204184.13670: Calling groups_inventory to load vars for managed-node3 41445 1727204184.13674: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.13688: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.13691: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.13693: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.14381: done sending task result for task 028d2410-947f-bf02-eee4-00000000010c 41445 1727204184.14385: WORKER PROCESS EXITING 41445 1727204184.14407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 41445 1727204184.14874: done with get_vars() 41445 1727204184.14884: variable 'ansible_search_path' from source: unknown 41445 1727204184.14885: variable 'ansible_search_path' from source: unknown 41445 1727204184.14918: we have included files to process 41445 1727204184.14919: generating all_blocks data 41445 1727204184.14921: done generating all_blocks data 41445 1727204184.14925: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41445 1727204184.14926: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41445 1727204184.14928: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41445 1727204184.15942: done processing included file 41445 1727204184.15945: iterating over new_blocks loaded from include file 41445 1727204184.15946: in VariableManager get_vars() 41445 1727204184.15959: done with get_vars() 41445 1727204184.15961: filtering new block on tags 41445 1727204184.16136: done filtering new block on tags 41445 1727204184.16140: in VariableManager get_vars() 41445 1727204184.16151: done with get_vars() 41445 1727204184.16153: filtering new block on tags 41445 1727204184.16165: done filtering new block on tags 41445 1727204184.16167: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 41445 1727204184.16172: extending task lists for all hosts with included blocks 41445 1727204184.16266: done extending task lists 41445 1727204184.16267: done processing included files 41445 1727204184.16268: results queue empty 41445 1727204184.16269: checking for any_errors_fatal 41445 1727204184.16271: done checking for any_errors_fatal 41445 1727204184.16272: checking for max_fail_percentage 41445 1727204184.16273: done 
checking for max_fail_percentage 41445 1727204184.16274: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.16275: done checking to see if all hosts have failed 41445 1727204184.16277: getting the remaining hosts for this loop 41445 1727204184.16278: done getting the remaining hosts for this loop 41445 1727204184.16280: getting the next task for host managed-node3 41445 1727204184.16284: done getting next task for host managed-node3 41445 1727204184.16286: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 41445 1727204184.16289: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.16291: getting variables 41445 1727204184.16292: in VariableManager get_vars() 41445 1727204184.16300: Calling all_inventory to load vars for managed-node3 41445 1727204184.16303: Calling groups_inventory to load vars for managed-node3 41445 1727204184.16305: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.16311: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.16318: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.16321: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.16458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.16648: done with get_vars() 41445 1727204184.16657: done getting variables 41445 1727204184.16724: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 41445 1727204184.16917: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.079) 0:00:02.957 ***** 41445 1727204184.16961: entering _queue_task() for managed-node3/command 41445 1727204184.16962: Creating lock for command 41445 1727204184.17302: worker is 1 (out of 1 available) 41445 1727204184.17313: exiting _queue_task() for managed-node3/command 41445 1727204184.17324: done queuing things up, now waiting for results queue to drain 41445 1727204184.17326: waiting for pending results... 
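The templated task name `Create EPEL {{ ansible_distribution_major_version }}` renders as "Create EPEL 10" in the banner, and the log shows a `command` action gated on `ansible_distribution in ['RedHat', 'CentOS']` (True) and `ansible_distribution_major_version in ['7', '8']` (False, so it skips on this EL10 node). A sketch under those logged conditionals — the actual command is an assumption (installing the `epel-release` package from the Fedora mirror is the conventional approach):

```yaml
# Sketch: the templated name and both conditionals are logged; the
# rpm command itself is an assumption, shown for illustration.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{
    ansible_distribution_major_version }}.noarch.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```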
41445 1727204184.17567: running TaskExecutor() for managed-node3/TASK: Create EPEL 10 41445 1727204184.17684: in run() - task 028d2410-947f-bf02-eee4-000000000126 41445 1727204184.17704: variable 'ansible_search_path' from source: unknown 41445 1727204184.17711: variable 'ansible_search_path' from source: unknown 41445 1727204184.17751: calling self._execute() 41445 1727204184.17833: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.17845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.17859: variable 'omit' from source: magic vars 41445 1727204184.18272: variable 'ansible_distribution' from source: facts 41445 1727204184.18291: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41445 1727204184.18420: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.18431: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41445 1727204184.18443: when evaluation is False, skipping this task 41445 1727204184.18450: _execute() done 41445 1727204184.18457: dumping result to json 41445 1727204184.18466: done dumping result, returning 41445 1727204184.18478: done running TaskExecutor() for managed-node3/TASK: Create EPEL 10 [028d2410-947f-bf02-eee4-000000000126] 41445 1727204184.18491: sending task result for task 028d2410-947f-bf02-eee4-000000000126 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41445 1727204184.18646: no more pending results, returning what we have 41445 1727204184.18649: results queue empty 41445 1727204184.18650: checking for any_errors_fatal 41445 1727204184.18651: done checking for any_errors_fatal 41445 1727204184.18652: checking for max_fail_percentage 41445 1727204184.18653: done checking for max_fail_percentage 41445 1727204184.18654: checking to see if all hosts have failed 
and the running result is not ok 41445 1727204184.18655: done checking to see if all hosts have failed 41445 1727204184.18656: getting the remaining hosts for this loop 41445 1727204184.18657: done getting the remaining hosts for this loop 41445 1727204184.18660: getting the next task for host managed-node3 41445 1727204184.18667: done getting next task for host managed-node3 41445 1727204184.18670: ^ task is: TASK: Install yum-utils package 41445 1727204184.18674: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.18680: getting variables 41445 1727204184.18682: in VariableManager get_vars() 41445 1727204184.18821: Calling all_inventory to load vars for managed-node3 41445 1727204184.18825: Calling groups_inventory to load vars for managed-node3 41445 1727204184.18828: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.19083: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.19087: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.19091: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.19339: done sending task result for task 028d2410-947f-bf02-eee4-000000000126 41445 1727204184.19342: WORKER PROCESS EXITING 41445 1727204184.19361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.19559: done with get_vars() 41445 1727204184.19568: done getting variables 41445 1727204184.19661: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.027) 0:00:02.984 ***** 41445 1727204184.19688: entering _queue_task() for managed-node3/package 41445 1727204184.19690: Creating lock for package 41445 1727204184.19928: worker is 1 (out of 1 available) 41445 1727204184.19940: exiting _queue_task() for managed-node3/package 41445 1727204184.19950: done queuing things up, now waiting for results queue to drain 41445 1727204184.19952: waiting for pending results... 
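The "Install yum-utils package" task at `enable_epel.yml:26` loads the `package` action plugin and carries the same two logged conditionals as the previous task, so it is likewise skipped on this node. A minimal sketch, assuming the package name matches the task title and `state: present`:

```yaml
# Sketch: the `package` action and the conditionals are logged; the
# package name is inferred from the task title, and `state: present`
# is an assumption.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

Using the generic `package` module (rather than `yum`/`dnf` directly) lets the same task work across EL releases where the underlying package manager differs.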
41445 1727204184.20178: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 41445 1727204184.20285: in run() - task 028d2410-947f-bf02-eee4-000000000127 41445 1727204184.20309: variable 'ansible_search_path' from source: unknown 41445 1727204184.20318: variable 'ansible_search_path' from source: unknown 41445 1727204184.20354: calling self._execute() 41445 1727204184.20469: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.20481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.20496: variable 'omit' from source: magic vars 41445 1727204184.20908: variable 'ansible_distribution' from source: facts 41445 1727204184.20925: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41445 1727204184.21061: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.21072: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41445 1727204184.21081: when evaluation is False, skipping this task 41445 1727204184.21087: _execute() done 41445 1727204184.21094: dumping result to json 41445 1727204184.21100: done dumping result, returning 41445 1727204184.21110: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [028d2410-947f-bf02-eee4-000000000127] 41445 1727204184.21120: sending task result for task 028d2410-947f-bf02-eee4-000000000127 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41445 1727204184.21416: no more pending results, returning what we have 41445 1727204184.21419: results queue empty 41445 1727204184.21420: checking for any_errors_fatal 41445 1727204184.21424: done checking for any_errors_fatal 41445 1727204184.21425: checking for max_fail_percentage 41445 1727204184.21426: done checking for max_fail_percentage 41445 1727204184.21427: checking to see if 
all hosts have failed and the running result is not ok 41445 1727204184.21428: done checking to see if all hosts have failed 41445 1727204184.21429: getting the remaining hosts for this loop 41445 1727204184.21430: done getting the remaining hosts for this loop 41445 1727204184.21433: getting the next task for host managed-node3 41445 1727204184.21438: done getting next task for host managed-node3 41445 1727204184.21440: ^ task is: TASK: Enable EPEL 7 41445 1727204184.21444: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.21447: getting variables 41445 1727204184.21448: in VariableManager get_vars() 41445 1727204184.21472: Calling all_inventory to load vars for managed-node3 41445 1727204184.21477: Calling groups_inventory to load vars for managed-node3 41445 1727204184.21480: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.21489: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.21491: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.21494: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.21729: done sending task result for task 028d2410-947f-bf02-eee4-000000000127 41445 1727204184.21732: WORKER PROCESS EXITING 41445 1727204184.21753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.21990: done with get_vars() 41445 1727204184.21999: done getting variables 41445 1727204184.22055: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.023) 0:00:03.008 ***** 41445 1727204184.22083: entering _queue_task() for managed-node3/command 41445 1727204184.22307: worker is 1 (out of 1 available) 41445 1727204184.22319: exiting _queue_task() for managed-node3/command 41445 1727204184.22329: done queuing things up, now waiting for results queue to drain 41445 1727204184.22330: waiting for pending results... 
41445 1727204184.22552: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 41445 1727204184.22682: in run() - task 028d2410-947f-bf02-eee4-000000000128 41445 1727204184.22710: variable 'ansible_search_path' from source: unknown 41445 1727204184.22719: variable 'ansible_search_path' from source: unknown 41445 1727204184.22753: calling self._execute() 41445 1727204184.22829: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.22840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.22853: variable 'omit' from source: magic vars 41445 1727204184.23222: variable 'ansible_distribution' from source: facts 41445 1727204184.23240: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41445 1727204184.23369: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.23386: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41445 1727204184.23394: when evaluation is False, skipping this task 41445 1727204184.23400: _execute() done 41445 1727204184.23406: dumping result to json 41445 1727204184.23415: done dumping result, returning 41445 1727204184.23430: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [028d2410-947f-bf02-eee4-000000000128] 41445 1727204184.23440: sending task result for task 028d2410-947f-bf02-eee4-000000000128 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41445 1727204184.23717: no more pending results, returning what we have 41445 1727204184.23721: results queue empty 41445 1727204184.23721: checking for any_errors_fatal 41445 1727204184.23728: done checking for any_errors_fatal 41445 1727204184.23729: checking for max_fail_percentage 41445 1727204184.23731: done checking for max_fail_percentage 41445 1727204184.23732: checking to see if all hosts have failed and 
the running result is not ok 41445 1727204184.23733: done checking to see if all hosts have failed 41445 1727204184.23734: getting the remaining hosts for this loop 41445 1727204184.23735: done getting the remaining hosts for this loop 41445 1727204184.23738: getting the next task for host managed-node3 41445 1727204184.23744: done getting next task for host managed-node3 41445 1727204184.23747: ^ task is: TASK: Enable EPEL 8 41445 1727204184.23751: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.23755: getting variables 41445 1727204184.23756: in VariableManager get_vars() 41445 1727204184.23785: Calling all_inventory to load vars for managed-node3 41445 1727204184.23788: Calling groups_inventory to load vars for managed-node3 41445 1727204184.23791: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.23802: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.23805: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.23808: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.24045: done sending task result for task 028d2410-947f-bf02-eee4-000000000128 41445 1727204184.24048: WORKER PROCESS EXITING 41445 1727204184.24072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.24287: done with get_vars() 41445 1727204184.24297: done getting variables 41445 1727204184.24354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.023) 0:00:03.031 ***** 41445 1727204184.24388: entering _queue_task() for managed-node3/command 41445 1727204184.24643: worker is 1 (out of 1 available) 41445 1727204184.24655: exiting _queue_task() for managed-node3/command 41445 1727204184.24668: done queuing things up, now waiting for results queue to drain 41445 1727204184.24669: waiting for pending results... 
41445 1727204184.24923: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 41445 1727204184.25039: in run() - task 028d2410-947f-bf02-eee4-000000000129 41445 1727204184.25059: variable 'ansible_search_path' from source: unknown 41445 1727204184.25068: variable 'ansible_search_path' from source: unknown 41445 1727204184.25119: calling self._execute() 41445 1727204184.25204: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.25217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.25229: variable 'omit' from source: magic vars 41445 1727204184.25596: variable 'ansible_distribution' from source: facts 41445 1727204184.25616: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41445 1727204184.25750: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.25763: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41445 1727204184.25770: when evaluation is False, skipping this task 41445 1727204184.25779: _execute() done 41445 1727204184.25787: dumping result to json 41445 1727204184.25794: done dumping result, returning 41445 1727204184.25805: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [028d2410-947f-bf02-eee4-000000000129] 41445 1727204184.25816: sending task result for task 028d2410-947f-bf02-eee4-000000000129 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41445 1727204184.26026: no more pending results, returning what we have 41445 1727204184.26030: results queue empty 41445 1727204184.26031: checking for any_errors_fatal 41445 1727204184.26037: done checking for any_errors_fatal 41445 1727204184.26037: checking for max_fail_percentage 41445 1727204184.26039: done checking for max_fail_percentage 41445 1727204184.26040: checking to see if all hosts have failed and 
the running result is not ok 41445 1727204184.26041: done checking to see if all hosts have failed 41445 1727204184.26042: getting the remaining hosts for this loop 41445 1727204184.26043: done getting the remaining hosts for this loop 41445 1727204184.26046: getting the next task for host managed-node3 41445 1727204184.26056: done getting next task for host managed-node3 41445 1727204184.26058: ^ task is: TASK: Enable EPEL 6 41445 1727204184.26063: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.26067: getting variables 41445 1727204184.26069: in VariableManager get_vars() 41445 1727204184.26103: Calling all_inventory to load vars for managed-node3 41445 1727204184.26106: Calling groups_inventory to load vars for managed-node3 41445 1727204184.26111: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.26124: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.26127: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.26131: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.26710: done sending task result for task 028d2410-947f-bf02-eee4-000000000129 41445 1727204184.26713: WORKER PROCESS EXITING 41445 1727204184.26737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.27130: done with get_vars() 41445 1727204184.27140: done getting variables 41445 1727204184.27199: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.028) 0:00:03.060 ***** 41445 1727204184.27225: entering _queue_task() for managed-node3/copy 41445 1727204184.27848: worker is 1 (out of 1 available) 41445 1727204184.27860: exiting _queue_task() for managed-node3/copy 41445 1727204184.27870: done queuing things up, now waiting for results queue to drain 41445 1727204184.27872: waiting for pending results... 
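The skipped "Enable EPEL 7/8/6" results above, each reporting a `false_condition`, are exactly what `when:`-guarded tasks produce when a distribution check fails. A minimal sketch of what `tasks/enable_epel.yml` plausibly contains around lines 32–42, reconstructed from the log: the task names, the action plugins (`command` for EPEL 7/8, `copy` for EPEL 6), and the evaluated conditionals are taken from the log; the task bodies are hypothetical stand-ins.

```yaml
- name: Enable EPEL 7
  # Hypothetical body; the log only confirms the 'command' action plugin.
  command: yum install -y epel-release
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

# "Enable EPEL 8" mirrors the task above with the same conditional,
# which is why both tasks log the identical false_condition.

- name: Enable EPEL 6
  # Hypothetical body; the log only confirms the 'copy' action plugin.
  copy:
    src: epel.repo
    dest: /etc/yum.repos.d/epel.repo
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```

On this node the first conditional evaluates True (`ansible_distribution in ['RedHat', 'CentOS']`) but the major-version check evaluates False, so each task is skipped with "Conditional result was False", matching the three `skipping: [managed-node3]` results in the log.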
41445 1727204184.28239: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 41445 1727204184.28462: in run() - task 028d2410-947f-bf02-eee4-00000000012b 41445 1727204184.28542: variable 'ansible_search_path' from source: unknown 41445 1727204184.28551: variable 'ansible_search_path' from source: unknown 41445 1727204184.28649: calling self._execute() 41445 1727204184.28801: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.28916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.28920: variable 'omit' from source: magic vars 41445 1727204184.29704: variable 'ansible_distribution' from source: facts 41445 1727204184.29843: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41445 1727204184.30014: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.30069: Evaluated conditional (ansible_distribution_major_version == '6'): False 41445 1727204184.30077: when evaluation is False, skipping this task 41445 1727204184.30084: _execute() done 41445 1727204184.30092: dumping result to json 41445 1727204184.30102: done dumping result, returning 41445 1727204184.30119: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [028d2410-947f-bf02-eee4-00000000012b] 41445 1727204184.30218: sending task result for task 028d2410-947f-bf02-eee4-00000000012b 41445 1727204184.30499: done sending task result for task 028d2410-947f-bf02-eee4-00000000012b 41445 1727204184.30502: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 41445 1727204184.30542: no more pending results, returning what we have 41445 1727204184.30544: results queue empty 41445 1727204184.30545: checking for any_errors_fatal 41445 1727204184.30551: done checking for any_errors_fatal 41445 1727204184.30551: checking for max_fail_percentage 
41445 1727204184.30553: done checking for max_fail_percentage 41445 1727204184.30553: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.30554: done checking to see if all hosts have failed 41445 1727204184.30555: getting the remaining hosts for this loop 41445 1727204184.30556: done getting the remaining hosts for this loop 41445 1727204184.30558: getting the next task for host managed-node3 41445 1727204184.30566: done getting next task for host managed-node3 41445 1727204184.30569: ^ task is: TASK: Set network provider to 'nm' 41445 1727204184.30571: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204184.30574: getting variables 41445 1727204184.30577: in VariableManager get_vars() 41445 1727204184.30601: Calling all_inventory to load vars for managed-node3 41445 1727204184.30604: Calling groups_inventory to load vars for managed-node3 41445 1727204184.30607: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.30784: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.30788: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.30792: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.31110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.31704: done with get_vars() 41445 1727204184.31715: done getting variables 41445 1727204184.32018: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:13 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.048) 0:00:03.108 ***** 41445 1727204184.32049: entering _queue_task() for managed-node3/set_fact 41445 1727204184.32964: worker is 1 (out of 1 available) 41445 1727204184.33178: exiting _queue_task() for managed-node3/set_fact 41445 1727204184.33189: done queuing things up, now waiting for results queue to drain 41445 1727204184.33191: waiting for pending results... 41445 1727204184.33339: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 41445 1727204184.33632: in run() - task 028d2410-947f-bf02-eee4-000000000007 41445 1727204184.33635: variable 'ansible_search_path' from source: unknown 41445 1727204184.33638: calling self._execute() 41445 1727204184.33861: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.33872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.34480: variable 'omit' from source: magic vars 41445 1727204184.34640: variable 'omit' from source: magic vars 41445 1727204184.34644: variable 'omit' from source: magic vars 41445 1727204184.34647: variable 'omit' from source: magic vars 41445 1727204184.34767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204184.34817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204184.35283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204184.35286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204184.35290: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204184.35293: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204184.35296: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.35299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.35302: Set connection var ansible_shell_executable to /bin/sh 41445 1727204184.35305: Set connection var ansible_shell_type to sh 41445 1727204184.35308: Set connection var ansible_pipelining to False 41445 1727204184.35482: Set connection var ansible_timeout to 10 41445 1727204184.35490: Set connection var ansible_connection to ssh 41445 1727204184.35504: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204184.35542: variable 'ansible_shell_executable' from source: unknown 41445 1727204184.35551: variable 'ansible_connection' from source: unknown 41445 1727204184.35558: variable 'ansible_module_compression' from source: unknown 41445 1727204184.35568: variable 'ansible_shell_type' from source: unknown 41445 1727204184.35576: variable 'ansible_shell_executable' from source: unknown 41445 1727204184.35585: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.35592: variable 'ansible_pipelining' from source: unknown 41445 1727204184.35598: variable 'ansible_timeout' from source: unknown 41445 1727204184.35605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.35783: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204184.35800: variable 'omit' from source: magic vars 41445 1727204184.35809: starting 
attempt loop 41445 1727204184.35815: running the handler 41445 1727204184.35831: handler run complete 41445 1727204184.35849: attempt loop complete, returning result 41445 1727204184.35857: _execute() done 41445 1727204184.35867: dumping result to json 41445 1727204184.35879: done dumping result, returning 41445 1727204184.35982: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [028d2410-947f-bf02-eee4-000000000007] 41445 1727204184.35985: sending task result for task 028d2410-947f-bf02-eee4-000000000007 41445 1727204184.36054: done sending task result for task 028d2410-947f-bf02-eee4-000000000007 41445 1727204184.36057: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 41445 1727204184.36146: no more pending results, returning what we have 41445 1727204184.36149: results queue empty 41445 1727204184.36150: checking for any_errors_fatal 41445 1727204184.36156: done checking for any_errors_fatal 41445 1727204184.36156: checking for max_fail_percentage 41445 1727204184.36159: done checking for max_fail_percentage 41445 1727204184.36160: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.36160: done checking to see if all hosts have failed 41445 1727204184.36161: getting the remaining hosts for this loop 41445 1727204184.36162: done getting the remaining hosts for this loop 41445 1727204184.36166: getting the next task for host managed-node3 41445 1727204184.36173: done getting next task for host managed-node3 41445 1727204184.36178: ^ task is: TASK: meta (flush_handlers) 41445 1727204184.36180: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.36185: getting variables 41445 1727204184.36186: in VariableManager get_vars() 41445 1727204184.36331: Calling all_inventory to load vars for managed-node3 41445 1727204184.36334: Calling groups_inventory to load vars for managed-node3 41445 1727204184.36338: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.36348: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.36351: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.36353: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.36683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.36897: done with get_vars() 41445 1727204184.36906: done getting variables 41445 1727204184.36967: in VariableManager get_vars() 41445 1727204184.36982: Calling all_inventory to load vars for managed-node3 41445 1727204184.36984: Calling groups_inventory to load vars for managed-node3 41445 1727204184.36986: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.36991: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.36993: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.36996: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.37142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.37344: done with get_vars() 41445 1727204184.37357: done queuing things up, now waiting for results queue to drain 41445 1727204184.37359: results queue empty 41445 1727204184.37360: checking for any_errors_fatal 41445 1727204184.37362: done checking for any_errors_fatal 41445 1727204184.37363: checking for max_fail_percentage 41445 1727204184.37364: done checking for max_fail_percentage 41445 1727204184.37364: checking to see if all hosts have failed and the running result is not 
ok 41445 1727204184.37365: done checking to see if all hosts have failed 41445 1727204184.37366: getting the remaining hosts for this loop 41445 1727204184.37367: done getting the remaining hosts for this loop 41445 1727204184.37369: getting the next task for host managed-node3 41445 1727204184.37372: done getting next task for host managed-node3 41445 1727204184.37373: ^ task is: TASK: meta (flush_handlers) 41445 1727204184.37377: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204184.37385: getting variables 41445 1727204184.37386: in VariableManager get_vars() 41445 1727204184.37393: Calling all_inventory to load vars for managed-node3 41445 1727204184.37395: Calling groups_inventory to load vars for managed-node3 41445 1727204184.37397: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.37409: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.37411: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.37414: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.37759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.38315: done with get_vars() 41445 1727204184.38323: done getting variables 41445 1727204184.38365: in VariableManager get_vars() 41445 1727204184.38373: Calling all_inventory to load vars for managed-node3 41445 1727204184.38404: Calling groups_inventory to load vars for managed-node3 41445 1727204184.38407: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.38412: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.38414: Calling groups_plugins_inventory to load vars for 
managed-node3 41445 1727204184.38417: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.38672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.39128: done with get_vars() 41445 1727204184.39140: done queuing things up, now waiting for results queue to drain 41445 1727204184.39142: results queue empty 41445 1727204184.39143: checking for any_errors_fatal 41445 1727204184.39144: done checking for any_errors_fatal 41445 1727204184.39145: checking for max_fail_percentage 41445 1727204184.39146: done checking for max_fail_percentage 41445 1727204184.39146: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.39147: done checking to see if all hosts have failed 41445 1727204184.39148: getting the remaining hosts for this loop 41445 1727204184.39148: done getting the remaining hosts for this loop 41445 1727204184.39151: getting the next task for host managed-node3 41445 1727204184.39154: done getting next task for host managed-node3 41445 1727204184.39155: ^ task is: None 41445 1727204184.39156: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.39157: done queuing things up, now waiting for results queue to drain 41445 1727204184.39158: results queue empty 41445 1727204184.39163: checking for any_errors_fatal 41445 1727204184.39164: done checking for any_errors_fatal 41445 1727204184.39165: checking for max_fail_percentage 41445 1727204184.39166: done checking for max_fail_percentage 41445 1727204184.39167: checking to see if all hosts have failed and the running result is not ok 41445 1727204184.39167: done checking to see if all hosts have failed 41445 1727204184.39169: getting the next task for host managed-node3 41445 1727204184.39172: done getting next task for host managed-node3 41445 1727204184.39172: ^ task is: None 41445 1727204184.39173: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.39244: in VariableManager get_vars() 41445 1727204184.39273: done with get_vars() 41445 1727204184.39282: in VariableManager get_vars() 41445 1727204184.39302: done with get_vars() 41445 1727204184.39308: variable 'omit' from source: magic vars 41445 1727204184.39341: in VariableManager get_vars() 41445 1727204184.39357: done with get_vars() 41445 1727204184.39399: variable 'omit' from source: magic vars PLAY [Play for testing route table] ******************************************** 41445 1727204184.39764: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41445 1727204184.39790: getting the remaining hosts for this loop 41445 1727204184.39792: done getting the remaining hosts for this loop 41445 1727204184.39794: getting the next task for host managed-node3 41445 1727204184.39797: done getting next task for host managed-node3 41445 1727204184.39798: ^ task is: TASK: Gathering Facts 41445 1727204184.39800: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204184.39802: getting variables 41445 1727204184.39803: in VariableManager get_vars() 41445 1727204184.39819: Calling all_inventory to load vars for managed-node3 41445 1727204184.39822: Calling groups_inventory to load vars for managed-node3 41445 1727204184.39824: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204184.39829: Calling all_plugins_play to load vars for managed-node3 41445 1727204184.39843: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204184.39846: Calling groups_plugins_play to load vars for managed-node3 41445 1727204184.40044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204184.40252: done with get_vars() 41445 1727204184.40261: done getting variables 41445 1727204184.40302: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.082) 0:00:03.191 ***** 41445 1727204184.40325: entering _queue_task() for managed-node3/gather_facts 41445 1727204184.40716: worker is 1 (out of 1 available) 41445 1727204184.40727: exiting _queue_task() for managed-node3/gather_facts 41445 1727204184.40735: done queuing things up, now waiting for results queue to drain 41445 1727204184.40736: waiting for pending results... 
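Earlier in the log, the "Set network provider to 'nm'" task returned `ok` with `"ansible_facts": {"network_provider": "nm"}` — the signature of a plain `set_fact` task. A sketch consistent with the task path `tests_route_table_nm.yml:13` shown in the log (the fact name and value come from the logged result; nothing else is assumed):

```yaml
- name: Set network provider to 'nm'
  set_fact:
    network_provider: "nm"
```

`set_fact` executes entirely on the controller, which is why its handler runs and completes with no `_low_level_execute_command()` or SSH debug lines — unlike the "Gathering Facts" task that follows, which immediately opens an SSH session to the managed node.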
41445 1727204184.41008: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41445 1727204184.41013: in run() - task 028d2410-947f-bf02-eee4-000000000151 41445 1727204184.41016: variable 'ansible_search_path' from source: unknown 41445 1727204184.41043: calling self._execute() 41445 1727204184.41135: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.41146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.41159: variable 'omit' from source: magic vars 41445 1727204184.41909: variable 'ansible_distribution_major_version' from source: facts 41445 1727204184.42018: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204184.42024: variable 'omit' from source: magic vars 41445 1727204184.42027: variable 'omit' from source: magic vars 41445 1727204184.42029: variable 'omit' from source: magic vars 41445 1727204184.42042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204184.42083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204184.42105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204184.42172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204184.42193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204184.42232: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204184.42244: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.42252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.42350: Set connection var ansible_shell_executable to /bin/sh 41445 1727204184.42357: Set 
connection var ansible_shell_type to sh 41445 1727204184.42365: Set connection var ansible_pipelining to False 41445 1727204184.42373: Set connection var ansible_timeout to 10 41445 1727204184.42382: Set connection var ansible_connection to ssh 41445 1727204184.42515: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204184.42783: variable 'ansible_shell_executable' from source: unknown 41445 1727204184.42787: variable 'ansible_connection' from source: unknown 41445 1727204184.42789: variable 'ansible_module_compression' from source: unknown 41445 1727204184.42792: variable 'ansible_shell_type' from source: unknown 41445 1727204184.42794: variable 'ansible_shell_executable' from source: unknown 41445 1727204184.42796: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204184.42798: variable 'ansible_pipelining' from source: unknown 41445 1727204184.42799: variable 'ansible_timeout' from source: unknown 41445 1727204184.42801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204184.43133: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204184.43150: variable 'omit' from source: magic vars 41445 1727204184.43160: starting attempt loop 41445 1727204184.43187: running the handler 41445 1727204184.43214: variable 'ansible_facts' from source: unknown 41445 1727204184.43326: _low_level_execute_command(): starting 41445 1727204184.43330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204184.44122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204184.44136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 
1727204184.44152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204184.44213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204184.44232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204184.44311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204184.44364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204184.44406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204184.44567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204184.46146: stdout chunk (state=3): >>>/root <<< 41445 1727204184.46325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204184.46400: stdout chunk (state=3): >>><<< 41445 1727204184.46404: stderr chunk (state=3): >>><<< 41445 1727204184.46449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204184.46641: _low_level_execute_command(): starting 41445 1727204184.46646: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666 `" && echo ansible-tmp-1727204184.464737-41917-120946994721666="` echo /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666 `" ) && sleep 0' 41445 1727204184.47871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204184.47878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address 
debug1: re-parsing configuration <<< 41445 1727204184.47888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204184.48155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204184.48166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204184.48197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204184.48337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204184.50171: stdout chunk (state=3): >>>ansible-tmp-1727204184.464737-41917-120946994721666=/root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666 <<< 41445 1727204184.50293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204184.50339: stderr chunk (state=3): >>><<< 41445 1727204184.50352: stdout chunk (state=3): >>><<< 41445 1727204184.50483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204184.464737-41917-120946994721666=/root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204184.50486: variable 'ansible_module_compression' from source: unknown 41445 1727204184.50565: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 1727204184.50809: variable 'ansible_facts' from source: unknown 41445 1727204184.51189: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py 41445 1727204184.51441: Sending initial data 41445 1727204184.51450: Sent initial data (153 bytes) 41445 1727204184.52682: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204184.52710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204184.52726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204184.52741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204184.52819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204184.52947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204184.53002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204184.53145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204184.54646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204184.54668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204184.54730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpszu3qmbx /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py <<< 41445 1727204184.54734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py" <<< 41445 1727204184.54887: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpszu3qmbx" to remote "/root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py" <<< 41445 1727204184.56993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204184.57029: stderr chunk (state=3): >>><<< 41445 1727204184.57054: stdout chunk (state=3): >>><<< 41445 1727204184.57081: done transferring module to remote 41445 1727204184.57097: _low_level_execute_command(): starting 41445 1727204184.57107: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/ /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py && sleep 0' 41445 1727204184.57738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204184.57760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204184.57774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204184.57794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204184.57864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204184.57907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204184.57925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204184.57945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204184.58013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204184.60107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204184.60112: stdout chunk (state=3): >>><<< 41445 1727204184.60114: stderr chunk (state=3): >>><<< 41445 1727204184.60287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204184.60291: _low_level_execute_command(): starting 41445 1727204184.60295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/AnsiballZ_setup.py && sleep 0' 41445 1727204184.60827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204184.60841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204184.60856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204184.60871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204184.60890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204184.60902: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204184.60918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204184.60998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204184.61030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204184.61049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204184.61068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204184.61142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204185.44783: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "25", "epoch": "1727204185", "epoch_int": "1727204185", "date": "2024-09-24", "time": "14:56:25", "iso8601_micro": "2024-09-24T18:56:25.021007Z", "iso8601": "2024-09-24T18:56:25Z", "iso8601_basic": "20240924T145625021007", "iso8601_basic_short": 
"20240924T145625", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2938, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 593, "free": 2938}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], 
"uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_<<< 41445 1727204185.44808: stdout chunk (state=3): >>>uptime_seconds": 762, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788962816, "block_size": 4096, "block_total": 65519099, "block_available": 63913321, "block_used": 1605778, "inode_total": 131070960, "inode_available": 131027339, "inode_used": 43621, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fi<<< 41445 1727204185.44831: stdout chunk (state=3): >>>xed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", 
"mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fix<<< 41445 1727204185.44842: stdout chunk (state=3): >>>ed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_loadavg": {"1m": 0.65087890625, "5m": 0.5322265625, "15m": 0.30810546875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": 
"tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204185.47471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204185.47505: stderr chunk (state=3): >>><<< 41445 1727204185.47511: stdout chunk (state=3): >>><<< 41445 1727204185.47543: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "25", "epoch": "1727204185", "epoch_int": "1727204185", "date": "2024-09-24", "time": "14:56:25", "iso8601_micro": "2024-09-24T18:56:25.021007Z", "iso8601": "2024-09-24T18:56:25Z", "iso8601_basic": "20240924T145625021007", "iso8601_basic_short": "20240924T145625", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2938, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 593, "free": 2938}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 762, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788962816, "block_size": 4096, "block_total": 65519099, "block_available": 63913321, "block_used": 1605778, "inode_total": 131070960, "inode_available": 131027339, "inode_used": 43621, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_loadavg": {"1m": 0.65087890625, "5m": 0.5322265625, "15m": 0.30810546875}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection 
to 10.31.47.22 closed. 41445 1727204185.47786: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204185.47804: _low_level_execute_command(): starting 41445 1727204185.47811: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204184.464737-41917-120946994721666/ > /dev/null 2>&1 && sleep 0' 41445 1727204185.48267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204185.48270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204185.48273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204185.48275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204185.48280: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204185.48335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204185.48342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204185.48345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204185.48378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204185.50889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204185.50926: stderr chunk (state=3): >>><<< 41445 1727204185.50929: stdout chunk (state=3): >>><<< 41445 1727204185.50939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41445 1727204185.50946: handler run complete 41445 1727204185.51035: variable 'ansible_facts' from source: unknown 41445 1727204185.51097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.51295: variable 'ansible_facts' from source: unknown 41445 1727204185.51354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.51433: attempt loop complete, returning result 41445 1727204185.51436: _execute() done 41445 1727204185.51439: dumping result to json 41445 1727204185.51460: done dumping result, returning 41445 1727204185.51480: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [028d2410-947f-bf02-eee4-000000000151] 41445 1727204185.51490: sending task result for task 028d2410-947f-bf02-eee4-000000000151 41445 1727204185.52049: done sending task result for task 028d2410-947f-bf02-eee4-000000000151 41445 1727204185.52051: WORKER PROCESS EXITING ok: [managed-node3] 41445 1727204185.52529: no more pending results, returning what we have 41445 1727204185.52534: results queue empty 41445 1727204185.52535: checking for any_errors_fatal 41445 1727204185.52536: done checking for any_errors_fatal 41445 1727204185.52537: checking for max_fail_percentage 41445 1727204185.52538: done checking for max_fail_percentage 41445 1727204185.52539: checking to see if all hosts have failed and the running result is not ok 41445 1727204185.52540: done checking to see if all hosts have failed 41445 1727204185.52540: getting the remaining hosts for this loop 41445 1727204185.52541: done getting the remaining hosts for this loop 41445 1727204185.52545: getting the next task for host managed-node3 41445 1727204185.52549: done getting next task for host managed-node3 41445 1727204185.52550: ^ task is: TASK: meta 
(flush_handlers) 41445 1727204185.52551: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204185.52553: getting variables 41445 1727204185.52554: in VariableManager get_vars() 41445 1727204185.52583: Calling all_inventory to load vars for managed-node3 41445 1727204185.52585: Calling groups_inventory to load vars for managed-node3 41445 1727204185.52589: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204185.52599: Calling all_plugins_play to load vars for managed-node3 41445 1727204185.52602: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204185.52607: Calling groups_plugins_play to load vars for managed-node3 41445 1727204185.52733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.52863: done with get_vars() 41445 1727204185.52872: done getting variables 41445 1727204185.52923: in VariableManager get_vars() 41445 1727204185.52933: Calling all_inventory to load vars for managed-node3 41445 1727204185.52934: Calling groups_inventory to load vars for managed-node3 41445 1727204185.52936: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204185.52938: Calling all_plugins_play to load vars for managed-node3 41445 1727204185.52940: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204185.52941: Calling groups_plugins_play to load vars for managed-node3 41445 1727204185.53030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.53147: done with get_vars() 41445 1727204185.53156: done queuing things up, now waiting for results queue to drain 41445 1727204185.53157: results queue 
empty 41445 1727204185.53158: checking for any_errors_fatal 41445 1727204185.53160: done checking for any_errors_fatal 41445 1727204185.53160: checking for max_fail_percentage 41445 1727204185.53164: done checking for max_fail_percentage 41445 1727204185.53165: checking to see if all hosts have failed and the running result is not ok 41445 1727204185.53165: done checking to see if all hosts have failed 41445 1727204185.53165: getting the remaining hosts for this loop 41445 1727204185.53166: done getting the remaining hosts for this loop 41445 1727204185.53168: getting the next task for host managed-node3 41445 1727204185.53170: done getting next task for host managed-node3 41445 1727204185.53171: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 41445 1727204185.53172: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204185.53174: getting variables 41445 1727204185.53174: in VariableManager get_vars() 41445 1727204185.53185: Calling all_inventory to load vars for managed-node3 41445 1727204185.53187: Calling groups_inventory to load vars for managed-node3 41445 1727204185.53189: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204185.53192: Calling all_plugins_play to load vars for managed-node3 41445 1727204185.53194: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204185.53197: Calling groups_plugins_play to load vars for managed-node3 41445 1727204185.53280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.53415: done with get_vars() 41445 1727204185.53420: done getting variables 41445 1727204185.53446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204185.53546: variable 'type' from source: play vars 41445 1727204185.53550: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:11 Tuesday 24 September 2024 14:56:25 -0400 (0:00:01.132) 0:00:04.323 ***** 41445 1727204185.53578: entering _queue_task() for managed-node3/set_fact 41445 1727204185.53785: worker is 1 (out of 1 available) 41445 1727204185.53800: exiting _queue_task() for managed-node3/set_fact 41445 1727204185.53810: done queuing things up, now waiting for results queue to drain 41445 1727204185.53811: waiting for pending results... 
41445 1727204185.53964: running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 41445 1727204185.54019: in run() - task 028d2410-947f-bf02-eee4-00000000000b 41445 1727204185.54029: variable 'ansible_search_path' from source: unknown 41445 1727204185.54057: calling self._execute() 41445 1727204185.54127: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.54131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.54138: variable 'omit' from source: magic vars 41445 1727204185.54414: variable 'ansible_distribution_major_version' from source: facts 41445 1727204185.54422: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204185.54428: variable 'omit' from source: magic vars 41445 1727204185.54443: variable 'omit' from source: magic vars 41445 1727204185.54462: variable 'type' from source: play vars 41445 1727204185.54523: variable 'type' from source: play vars 41445 1727204185.54526: variable 'interface' from source: play vars 41445 1727204185.54780: variable 'interface' from source: play vars 41445 1727204185.54783: variable 'omit' from source: magic vars 41445 1727204185.54785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204185.54788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204185.54790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204185.54792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204185.54794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204185.54796: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 
1727204185.54799: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.54800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.54885: Set connection var ansible_shell_executable to /bin/sh 41445 1727204185.54893: Set connection var ansible_shell_type to sh 41445 1727204185.54903: Set connection var ansible_pipelining to False 41445 1727204185.54918: Set connection var ansible_timeout to 10 41445 1727204185.54924: Set connection var ansible_connection to ssh 41445 1727204185.54939: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204185.54962: variable 'ansible_shell_executable' from source: unknown 41445 1727204185.54965: variable 'ansible_connection' from source: unknown 41445 1727204185.54968: variable 'ansible_module_compression' from source: unknown 41445 1727204185.54970: variable 'ansible_shell_type' from source: unknown 41445 1727204185.54972: variable 'ansible_shell_executable' from source: unknown 41445 1727204185.54974: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.54979: variable 'ansible_pipelining' from source: unknown 41445 1727204185.54982: variable 'ansible_timeout' from source: unknown 41445 1727204185.54986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.55107: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204185.55281: variable 'omit' from source: magic vars 41445 1727204185.55283: starting attempt loop 41445 1727204185.55285: running the handler 41445 1727204185.55288: handler run complete 41445 1727204185.55289: attempt loop complete, returning result 41445 1727204185.55291: _execute() done 41445 
1727204185.55292: dumping result to json
41445 1727204185.55294: done dumping result, returning
41445 1727204185.55296: done running TaskExecutor() for managed-node3/TASK: Set type=veth and interface=ethtest0 [028d2410-947f-bf02-eee4-00000000000b]
41445 1727204185.55298: sending task result for task 028d2410-947f-bf02-eee4-00000000000b
41445 1727204185.55354: done sending task result for task 028d2410-947f-bf02-eee4-00000000000b
41445 1727204185.55357: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "ansible_facts": {
        "interface": "ethtest0",
        "type": "veth"
    },
    "changed": false
}
41445 1727204185.55401: no more pending results, returning what we have
41445 1727204185.55404: results queue empty
41445 1727204185.55404: checking for any_errors_fatal
41445 1727204185.55406: done checking for any_errors_fatal
41445 1727204185.55407: checking for max_fail_percentage
41445 1727204185.55408: done checking for max_fail_percentage
41445 1727204185.55409: checking to see if all hosts have failed and the running result is not ok
41445 1727204185.55410: done checking to see if all hosts have failed
41445 1727204185.55411: getting the remaining hosts for this loop
41445 1727204185.55412: done getting the remaining hosts for this loop
41445 1727204185.55415: getting the next task for host managed-node3
41445 1727204185.55418: done getting next task for host managed-node3
41445 1727204185.55421: ^ task is: TASK: Include the task 'show_interfaces.yml'
41445 1727204185.55422: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41445 1727204185.55425: getting variables
41445 1727204185.55426: in VariableManager get_vars()
41445 1727204185.55458: Calling all_inventory to load vars for managed-node3
41445 1727204185.55460: Calling groups_inventory to load vars for managed-node3
41445 1727204185.55462: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204185.55470: Calling all_plugins_play to load vars for managed-node3
41445 1727204185.55472: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204185.55474: Calling groups_plugins_play to load vars for managed-node3
41445 1727204185.55753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204185.55970: done with get_vars()
41445 1727204185.55983: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:15
Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.024) 0:00:04.348 *****
41445 1727204185.56068: entering _queue_task() for managed-node3/include_tasks
41445 1727204185.56313: worker is 1 (out of 1 available)
41445 1727204185.56324: exiting _queue_task() for managed-node3/include_tasks
41445 1727204185.56333: done queuing things up, now waiting for results queue to drain
41445 1727204185.56334: waiting for pending results... 
41445 1727204185.56583: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 41445 1727204185.56691: in run() - task 028d2410-947f-bf02-eee4-00000000000c 41445 1727204185.56782: variable 'ansible_search_path' from source: unknown 41445 1727204185.56785: calling self._execute() 41445 1727204185.56850: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.56862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.56874: variable 'omit' from source: magic vars 41445 1727204185.57254: variable 'ansible_distribution_major_version' from source: facts 41445 1727204185.57274: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204185.57288: _execute() done 41445 1727204185.57296: dumping result to json 41445 1727204185.57335: done dumping result, returning 41445 1727204185.57339: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-bf02-eee4-00000000000c] 41445 1727204185.57341: sending task result for task 028d2410-947f-bf02-eee4-00000000000c 41445 1727204185.57504: no more pending results, returning what we have 41445 1727204185.57513: in VariableManager get_vars() 41445 1727204185.57560: Calling all_inventory to load vars for managed-node3 41445 1727204185.57563: Calling groups_inventory to load vars for managed-node3 41445 1727204185.57565: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204185.57581: Calling all_plugins_play to load vars for managed-node3 41445 1727204185.57584: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204185.57588: Calling groups_plugins_play to load vars for managed-node3 41445 1727204185.57994: done sending task result for task 028d2410-947f-bf02-eee4-00000000000c 41445 1727204185.57998: WORKER PROCESS EXITING 41445 1727204185.58024: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.58229: done with get_vars() 41445 1727204185.58235: variable 'ansible_search_path' from source: unknown 41445 1727204185.58245: we have included files to process 41445 1727204185.58246: generating all_blocks data 41445 1727204185.58247: done generating all_blocks data 41445 1727204185.58248: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41445 1727204185.58249: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41445 1727204185.58250: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41445 1727204185.58385: in VariableManager get_vars() 41445 1727204185.58405: done with get_vars() 41445 1727204185.58512: done processing included file 41445 1727204185.58514: iterating over new_blocks loaded from include file 41445 1727204185.58516: in VariableManager get_vars() 41445 1727204185.58530: done with get_vars() 41445 1727204185.58531: filtering new block on tags 41445 1727204185.58545: done filtering new block on tags 41445 1727204185.58547: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 41445 1727204185.58551: extending task lists for all hosts with included blocks 41445 1727204185.60524: done extending task lists 41445 1727204185.60526: done processing included files 41445 1727204185.60527: results queue empty 41445 1727204185.60527: checking for any_errors_fatal 41445 1727204185.60530: done checking for any_errors_fatal 41445 1727204185.60531: checking for max_fail_percentage 41445 1727204185.60532: done checking for 
max_fail_percentage 41445 1727204185.60533: checking to see if all hosts have failed and the running result is not ok 41445 1727204185.60534: done checking to see if all hosts have failed 41445 1727204185.60534: getting the remaining hosts for this loop 41445 1727204185.60535: done getting the remaining hosts for this loop 41445 1727204185.60538: getting the next task for host managed-node3 41445 1727204185.60542: done getting next task for host managed-node3 41445 1727204185.60544: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41445 1727204185.60546: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41445 1727204185.60549: getting variables
41445 1727204185.60550: in VariableManager get_vars()
41445 1727204185.60565: Calling all_inventory to load vars for managed-node3
41445 1727204185.60567: Calling groups_inventory to load vars for managed-node3
41445 1727204185.60569: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204185.60577: Calling all_plugins_play to load vars for managed-node3
41445 1727204185.60580: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204185.60583: Calling groups_plugins_play to load vars for managed-node3
41445 1727204185.60741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204185.61133: done with get_vars()
41445 1727204185.61142: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.051) 0:00:04.399 *****
41445 1727204185.61215: entering _queue_task() for managed-node3/include_tasks
41445 1727204185.61492: worker is 1 (out of 1 available)
41445 1727204185.61506: exiting _queue_task() for managed-node3/include_tasks
41445 1727204185.61520: done queuing things up, now waiting for results queue to drain
41445 1727204185.61521: waiting for pending results... 
41445 1727204185.61799: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 41445 1727204185.61906: in run() - task 028d2410-947f-bf02-eee4-000000000169 41445 1727204185.61932: variable 'ansible_search_path' from source: unknown 41445 1727204185.61940: variable 'ansible_search_path' from source: unknown 41445 1727204185.61988: calling self._execute() 41445 1727204185.62088: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.62100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.62119: variable 'omit' from source: magic vars 41445 1727204185.62511: variable 'ansible_distribution_major_version' from source: facts 41445 1727204185.62531: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204185.62541: _execute() done 41445 1727204185.62549: dumping result to json 41445 1727204185.62557: done dumping result, returning 41445 1727204185.62567: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-bf02-eee4-000000000169] 41445 1727204185.62580: sending task result for task 028d2410-947f-bf02-eee4-000000000169 41445 1727204185.62712: no more pending results, returning what we have 41445 1727204185.62717: in VariableManager get_vars() 41445 1727204185.62765: Calling all_inventory to load vars for managed-node3 41445 1727204185.62768: Calling groups_inventory to load vars for managed-node3 41445 1727204185.62771: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204185.62786: Calling all_plugins_play to load vars for managed-node3 41445 1727204185.62790: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204185.62793: Calling groups_plugins_play to load vars for managed-node3 41445 1727204185.63228: done sending task result for task 028d2410-947f-bf02-eee4-000000000169 41445 1727204185.63231: WORKER PROCESS EXITING 41445 
1727204185.63256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204185.63456: done with get_vars() 41445 1727204185.63464: variable 'ansible_search_path' from source: unknown 41445 1727204185.63465: variable 'ansible_search_path' from source: unknown 41445 1727204185.63503: we have included files to process 41445 1727204185.63504: generating all_blocks data 41445 1727204185.63506: done generating all_blocks data 41445 1727204185.63507: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41445 1727204185.63511: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41445 1727204185.63513: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41445 1727204185.63814: done processing included file 41445 1727204185.63816: iterating over new_blocks loaded from include file 41445 1727204185.63817: in VariableManager get_vars() 41445 1727204185.63836: done with get_vars() 41445 1727204185.63838: filtering new block on tags 41445 1727204185.63854: done filtering new block on tags 41445 1727204185.63856: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 41445 1727204185.63860: extending task lists for all hosts with included blocks 41445 1727204185.63963: done extending task lists 41445 1727204185.63964: done processing included files 41445 1727204185.63965: results queue empty 41445 1727204185.63965: checking for any_errors_fatal 41445 1727204185.63968: done checking for any_errors_fatal 41445 1727204185.63968: checking for max_fail_percentage 41445 1727204185.63969: done 
checking for max_fail_percentage 41445 1727204185.63970: checking to see if all hosts have failed and the running result is not ok 41445 1727204185.63971: done checking to see if all hosts have failed 41445 1727204185.63972: getting the remaining hosts for this loop 41445 1727204185.63973: done getting the remaining hosts for this loop 41445 1727204185.63975: getting the next task for host managed-node3 41445 1727204185.63980: done getting next task for host managed-node3 41445 1727204185.63982: ^ task is: TASK: Gather current interface info 41445 1727204185.63985: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41445 1727204185.63987: getting variables
41445 1727204185.63988: in VariableManager get_vars()
41445 1727204185.64000: Calling all_inventory to load vars for managed-node3
41445 1727204185.64002: Calling groups_inventory to load vars for managed-node3
41445 1727204185.64004: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204185.64011: Calling all_plugins_play to load vars for managed-node3
41445 1727204185.64013: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204185.64016: Calling groups_plugins_play to load vars for managed-node3
41445 1727204185.64191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204185.64392: done with get_vars()
41445 1727204185.64401: done getting variables
41445 1727204185.64437: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.032) 0:00:04.432 *****
41445 1727204185.64461: entering _queue_task() for managed-node3/command
41445 1727204185.64708: worker is 1 (out of 1 available)
41445 1727204185.64723: exiting _queue_task() for managed-node3/command
41445 1727204185.64735: done queuing things up, now waiting for results queue to drain
41445 1727204185.64736: waiting for pending results... 
41445 1727204185.64990: running TaskExecutor() for managed-node3/TASK: Gather current interface info 41445 1727204185.65114: in run() - task 028d2410-947f-bf02-eee4-00000000024e 41445 1727204185.65133: variable 'ansible_search_path' from source: unknown 41445 1727204185.65141: variable 'ansible_search_path' from source: unknown 41445 1727204185.65182: calling self._execute() 41445 1727204185.65275: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.65290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.65326: variable 'omit' from source: magic vars 41445 1727204185.65689: variable 'ansible_distribution_major_version' from source: facts 41445 1727204185.65884: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204185.65888: variable 'omit' from source: magic vars 41445 1727204185.65890: variable 'omit' from source: magic vars 41445 1727204185.65893: variable 'omit' from source: magic vars 41445 1727204185.65895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204185.65898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204185.65918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204185.65943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204185.65960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204185.65995: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204185.66023: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.66030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 
1727204185.66240: Set connection var ansible_shell_executable to /bin/sh 41445 1727204185.66248: Set connection var ansible_shell_type to sh 41445 1727204185.66259: Set connection var ansible_pipelining to False 41445 1727204185.66272: Set connection var ansible_timeout to 10 41445 1727204185.66451: Set connection var ansible_connection to ssh 41445 1727204185.66453: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204185.66456: variable 'ansible_shell_executable' from source: unknown 41445 1727204185.66458: variable 'ansible_connection' from source: unknown 41445 1727204185.66460: variable 'ansible_module_compression' from source: unknown 41445 1727204185.66462: variable 'ansible_shell_type' from source: unknown 41445 1727204185.66464: variable 'ansible_shell_executable' from source: unknown 41445 1727204185.66466: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204185.66467: variable 'ansible_pipelining' from source: unknown 41445 1727204185.66469: variable 'ansible_timeout' from source: unknown 41445 1727204185.66471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204185.66725: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204185.66741: variable 'omit' from source: magic vars 41445 1727204185.66751: starting attempt loop 41445 1727204185.66757: running the handler 41445 1727204185.66987: _low_level_execute_command(): starting 41445 1727204185.66990: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204185.68111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204185.68164: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204185.68171: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204185.68243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204185.68267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204185.68285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204185.68360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204185.70753: stdout chunk (state=3): >>>/root <<< 41445 1727204185.71028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204185.71032: stdout chunk (state=3): >>><<< 41445 1727204185.71041: stderr chunk (state=3): >>><<< 41445 1727204185.71063: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41445 1727204185.71079: _low_level_execute_command(): starting 41445 1727204185.71086: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703 `" && echo ansible-tmp-1727204185.7106369-41976-274102551498703="` echo /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703 `" ) && sleep 0' 41445 1727204185.72392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204185.72413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204185.72481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204185.72489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204185.75245: stdout chunk (state=3): >>>ansible-tmp-1727204185.7106369-41976-274102551498703=/root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703 <<< 41445 1727204185.75389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204185.75432: stderr chunk (state=3): >>><<< 41445 1727204185.75436: stdout chunk (state=3): >>><<< 41445 1727204185.75461: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204185.7106369-41976-274102551498703=/root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41445 1727204185.75505: variable 'ansible_module_compression' from source: unknown 41445 1727204185.75573: ANSIBALLZ: Using generic lock for ansible.legacy.command 41445 1727204185.75659: ANSIBALLZ: Acquiring lock 41445 1727204185.75667: ANSIBALLZ: Lock acquired: 140182283768784 41445 1727204185.75679: ANSIBALLZ: Creating module 41445 1727204185.93094: ANSIBALLZ: Writing module into payload 41445 1727204185.93222: ANSIBALLZ: Writing module 41445 1727204185.93258: ANSIBALLZ: Renaming module 41445 1727204185.93273: ANSIBALLZ: Done creating module 41445 1727204185.93298: variable 'ansible_facts' from source: unknown 41445 1727204185.93454: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py 41445 1727204185.93562: Sending initial data 41445 1727204185.93565: Sent initial data (156 bytes) 41445 1727204185.94346: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204185.94349: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204185.94577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204185.94643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204185.96886: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41445 1727204185.96896: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204185.96928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204185.96962: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpe6kz1ehw /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py <<< 41445 1727204185.96971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py" <<< 41445 1727204185.96998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpe6kz1ehw" to remote "/root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py" <<< 41445 1727204185.97503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204185.97540: stderr chunk (state=3): >>><<< 41445 1727204185.97544: stdout chunk (state=3): >>><<< 41445 1727204185.97578: done transferring module to remote 41445 1727204185.97588: _low_level_execute_command(): starting 41445 1727204185.97593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/ /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py && sleep 0' 41445 1727204185.98558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204185.98717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204185.98752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204186.01223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.01230: stderr chunk (state=3): >>><<< 41445 1727204186.01237: stdout chunk (state=3): >>><<< 41445 1727204186.01247: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41445 1727204186.01250: _low_level_execute_command(): starting 41445 1727204186.01256: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/AnsiballZ_command.py && sleep 0' 41445 1727204186.01655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204186.01673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204186.01680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.01693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.01745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204186.01753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.01804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41445 1727204186.22901: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:26.221532", "end": "2024-09-24 14:56:26.224747", "delta": "0:00:00.003215", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204186.25000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.25017: stderr chunk (state=3): >>>Shared connection to 10.31.47.22 closed. <<< 41445 1727204186.25070: stderr chunk (state=3): >>><<< 41445 1727204186.25085: stdout chunk (state=3): >>><<< 41445 1727204186.25118: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:26.221532", "end": "2024-09-24 14:56:26.224747", "delta": "0:00:00.003215", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204186.25161: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204186.25173: _low_level_execute_command(): starting 41445 1727204186.25184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204185.7106369-41976-274102551498703/ > /dev/null 2>&1 && sleep 0' 41445 1727204186.25800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204186.25819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204186.25834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204186.25852: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204186.25870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204186.25892: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204186.25910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.25932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204186.25945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204186.25957: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204186.25967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204186.26055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204186.26079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204186.26103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.26158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.28501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.28566: stderr chunk (state=3): >>><<< 41445 1727204186.28578: stdout chunk (state=3): >>><<< 41445 1727204186.28601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204186.28622: handler run complete 41445 1727204186.28649: Evaluated conditional (False): False 41445 1727204186.28665: attempt loop complete, returning result 41445 1727204186.28678: _execute() done 41445 1727204186.28685: dumping result to json 41445 1727204186.28692: done dumping result, returning 41445 1727204186.28703: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [028d2410-947f-bf02-eee4-00000000024e] 41445 1727204186.28714: sending task result for task 028d2410-947f-bf02-eee4-00000000024e ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003215", "end": "2024-09-24 14:56:26.224747", "rc": 0, "start": "2024-09-24 14:56:26.221532" } STDOUT: bonding_masters eth0 lo rpltstbr 41445 1727204186.29164: no more pending results, returning what we have 41445 1727204186.29168: results queue empty 41445 1727204186.29169: checking for 
any_errors_fatal 41445 1727204186.29170: done checking for any_errors_fatal 41445 1727204186.29171: checking for max_fail_percentage 41445 1727204186.29173: done checking for max_fail_percentage 41445 1727204186.29174: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.29174: done checking to see if all hosts have failed 41445 1727204186.29177: getting the remaining hosts for this loop 41445 1727204186.29179: done getting the remaining hosts for this loop 41445 1727204186.29183: getting the next task for host managed-node3 41445 1727204186.29191: done getting next task for host managed-node3 41445 1727204186.29194: ^ task is: TASK: Set current_interfaces 41445 1727204186.29198: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.29202: getting variables 41445 1727204186.29204: in VariableManager get_vars() 41445 1727204186.29251: Calling all_inventory to load vars for managed-node3 41445 1727204186.29255: Calling groups_inventory to load vars for managed-node3 41445 1727204186.29257: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.29269: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.29272: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.29284: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.29296: done sending task result for task 028d2410-947f-bf02-eee4-00000000024e 41445 1727204186.29299: WORKER PROCESS EXITING 41445 1727204186.29737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.30048: done with get_vars() 41445 1727204186.30061: done getting variables 41445 1727204186.30119: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.656) 0:00:05.089 ***** 41445 1727204186.30145: entering _queue_task() for managed-node3/set_fact 41445 1727204186.30439: worker is 1 (out of 1 available) 41445 1727204186.30450: exiting _queue_task() for managed-node3/set_fact 41445 1727204186.30461: done queuing things up, now waiting for results queue to drain 41445 1727204186.30463: waiting for pending results... 
41445 1727204186.30738: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 41445 1727204186.30860: in run() - task 028d2410-947f-bf02-eee4-00000000024f 41445 1727204186.30882: variable 'ansible_search_path' from source: unknown 41445 1727204186.30890: variable 'ansible_search_path' from source: unknown 41445 1727204186.30942: calling self._execute() 41445 1727204186.31047: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.31058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.31072: variable 'omit' from source: magic vars 41445 1727204186.31489: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.31511: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.31524: variable 'omit' from source: magic vars 41445 1727204186.31579: variable 'omit' from source: magic vars 41445 1727204186.31685: variable '_current_interfaces' from source: set_fact 41445 1727204186.31758: variable 'omit' from source: magic vars 41445 1727204186.31820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204186.31863: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204186.31898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204186.31928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.31946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.32013: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204186.32022: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.32025: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.32129: Set connection var ansible_shell_executable to /bin/sh 41445 1727204186.32228: Set connection var ansible_shell_type to sh 41445 1727204186.32232: Set connection var ansible_pipelining to False 41445 1727204186.32236: Set connection var ansible_timeout to 10 41445 1727204186.32238: Set connection var ansible_connection to ssh 41445 1727204186.32240: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204186.32243: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.32245: variable 'ansible_connection' from source: unknown 41445 1727204186.32247: variable 'ansible_module_compression' from source: unknown 41445 1727204186.32249: variable 'ansible_shell_type' from source: unknown 41445 1727204186.32251: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.32253: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.32255: variable 'ansible_pipelining' from source: unknown 41445 1727204186.32257: variable 'ansible_timeout' from source: unknown 41445 1727204186.32259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.32446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204186.32449: variable 'omit' from source: magic vars 41445 1727204186.32452: starting attempt loop 41445 1727204186.32454: running the handler 41445 1727204186.32482: handler run complete 41445 1727204186.32485: attempt loop complete, returning result 41445 1727204186.32487: _execute() done 41445 1727204186.32492: dumping result to json 41445 1727204186.32553: done dumping result, returning 41445 
1727204186.32557: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [028d2410-947f-bf02-eee4-00000000024f] 41445 1727204186.32560: sending task result for task 028d2410-947f-bf02-eee4-00000000024f 41445 1727204186.32880: done sending task result for task 028d2410-947f-bf02-eee4-00000000024f 41445 1727204186.32884: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 41445 1727204186.32944: no more pending results, returning what we have 41445 1727204186.32947: results queue empty 41445 1727204186.32948: checking for any_errors_fatal 41445 1727204186.32956: done checking for any_errors_fatal 41445 1727204186.32956: checking for max_fail_percentage 41445 1727204186.32958: done checking for max_fail_percentage 41445 1727204186.32959: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.32959: done checking to see if all hosts have failed 41445 1727204186.32960: getting the remaining hosts for this loop 41445 1727204186.32961: done getting the remaining hosts for this loop 41445 1727204186.32965: getting the next task for host managed-node3 41445 1727204186.32972: done getting next task for host managed-node3 41445 1727204186.32977: ^ task is: TASK: Show current_interfaces 41445 1727204186.32979: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.32984: getting variables 41445 1727204186.32985: in VariableManager get_vars() 41445 1727204186.33026: Calling all_inventory to load vars for managed-node3 41445 1727204186.33029: Calling groups_inventory to load vars for managed-node3 41445 1727204186.33032: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.33042: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.33045: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.33048: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.33302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.33536: done with get_vars() 41445 1727204186.33548: done getting variables 41445 1727204186.33653: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.035) 0:00:05.124 ***** 41445 1727204186.33684: entering _queue_task() for managed-node3/debug 41445 1727204186.33685: Creating lock for debug 41445 1727204186.34086: worker is 1 (out of 1 available) 41445 1727204186.34096: exiting _queue_task() for managed-node3/debug 41445 1727204186.34106: done queuing things up, now waiting for results queue to drain 41445 1727204186.34107: waiting for pending results... 
41445 1727204186.34261: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 41445 1727204186.34388: in run() - task 028d2410-947f-bf02-eee4-00000000016a 41445 1727204186.34407: variable 'ansible_search_path' from source: unknown 41445 1727204186.34414: variable 'ansible_search_path' from source: unknown 41445 1727204186.34478: calling self._execute() 41445 1727204186.34530: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.34534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.34542: variable 'omit' from source: magic vars 41445 1727204186.34817: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.34831: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.34835: variable 'omit' from source: magic vars 41445 1727204186.34858: variable 'omit' from source: magic vars 41445 1727204186.34926: variable 'current_interfaces' from source: set_fact 41445 1727204186.34950: variable 'omit' from source: magic vars 41445 1727204186.34983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204186.35009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204186.35027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204186.35042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.35051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.35076: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204186.35080: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.35082: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.35149: Set connection var ansible_shell_executable to /bin/sh 41445 1727204186.35154: Set connection var ansible_shell_type to sh 41445 1727204186.35156: Set connection var ansible_pipelining to False 41445 1727204186.35168: Set connection var ansible_timeout to 10 41445 1727204186.35171: Set connection var ansible_connection to ssh 41445 1727204186.35173: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204186.35193: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.35196: variable 'ansible_connection' from source: unknown 41445 1727204186.35199: variable 'ansible_module_compression' from source: unknown 41445 1727204186.35201: variable 'ansible_shell_type' from source: unknown 41445 1727204186.35204: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.35206: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.35208: variable 'ansible_pipelining' from source: unknown 41445 1727204186.35210: variable 'ansible_timeout' from source: unknown 41445 1727204186.35217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.35319: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204186.35326: variable 'omit' from source: magic vars 41445 1727204186.35331: starting attempt loop 41445 1727204186.35334: running the handler 41445 1727204186.35368: handler run complete 41445 1727204186.35383: attempt loop complete, returning result 41445 1727204186.35386: _execute() done 41445 1727204186.35389: dumping result to json 41445 1727204186.35391: done dumping result, returning 41445 1727204186.35394: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [028d2410-947f-bf02-eee4-00000000016a] 41445 1727204186.35400: sending task result for task 028d2410-947f-bf02-eee4-00000000016a 41445 1727204186.35478: done sending task result for task 028d2410-947f-bf02-eee4-00000000016a 41445 1727204186.35481: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 41445 1727204186.35531: no more pending results, returning what we have 41445 1727204186.35534: results queue empty 41445 1727204186.35535: checking for any_errors_fatal 41445 1727204186.35538: done checking for any_errors_fatal 41445 1727204186.35539: checking for max_fail_percentage 41445 1727204186.35540: done checking for max_fail_percentage 41445 1727204186.35541: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.35542: done checking to see if all hosts have failed 41445 1727204186.35543: getting the remaining hosts for this loop 41445 1727204186.35544: done getting the remaining hosts for this loop 41445 1727204186.35548: getting the next task for host managed-node3 41445 1727204186.35554: done getting next task for host managed-node3 41445 1727204186.35557: ^ task is: TASK: Include the task 'manage_test_interface.yml' 41445 1727204186.35559: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.35563: getting variables 41445 1727204186.35565: in VariableManager get_vars() 41445 1727204186.35603: Calling all_inventory to load vars for managed-node3 41445 1727204186.35606: Calling groups_inventory to load vars for managed-node3 41445 1727204186.35608: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.35616: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.35619: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.35621: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.35796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.35924: done with get_vars() 41445 1727204186.35932: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:17 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.023) 0:00:05.147 ***** 41445 1727204186.35991: entering _queue_task() for managed-node3/include_tasks 41445 1727204186.36187: worker is 1 (out of 1 available) 41445 1727204186.36205: exiting _queue_task() for managed-node3/include_tasks 41445 1727204186.36217: done queuing things up, now waiting for results queue to drain 41445 1727204186.36219: waiting for pending results... 
41445 1727204186.36500: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 41445 1727204186.36504: in run() - task 028d2410-947f-bf02-eee4-00000000000d 41445 1727204186.36525: variable 'ansible_search_path' from source: unknown 41445 1727204186.36566: calling self._execute() 41445 1727204186.36668: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.36682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.36708: variable 'omit' from source: magic vars 41445 1727204186.37071: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.37091: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.37104: _execute() done 41445 1727204186.37112: dumping result to json 41445 1727204186.37121: done dumping result, returning 41445 1727204186.37139: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [028d2410-947f-bf02-eee4-00000000000d] 41445 1727204186.37155: sending task result for task 028d2410-947f-bf02-eee4-00000000000d 41445 1727204186.37367: done sending task result for task 028d2410-947f-bf02-eee4-00000000000d 41445 1727204186.37371: WORKER PROCESS EXITING 41445 1727204186.37402: no more pending results, returning what we have 41445 1727204186.37407: in VariableManager get_vars() 41445 1727204186.37454: Calling all_inventory to load vars for managed-node3 41445 1727204186.37468: Calling groups_inventory to load vars for managed-node3 41445 1727204186.37472: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.37483: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.37485: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.37487: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.37636: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.37767: done with get_vars() 41445 1727204186.37773: variable 'ansible_search_path' from source: unknown 41445 1727204186.37783: we have included files to process 41445 1727204186.37786: generating all_blocks data 41445 1727204186.37787: done generating all_blocks data 41445 1727204186.37792: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41445 1727204186.37793: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41445 1727204186.37795: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41445 1727204186.38133: in VariableManager get_vars() 41445 1727204186.38148: done with get_vars() 41445 1727204186.38293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 41445 1727204186.38657: done processing included file 41445 1727204186.38659: iterating over new_blocks loaded from include file 41445 1727204186.38660: in VariableManager get_vars() 41445 1727204186.38671: done with get_vars() 41445 1727204186.38672: filtering new block on tags 41445 1727204186.38693: done filtering new block on tags 41445 1727204186.38695: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 41445 1727204186.38707: extending task lists for all hosts with included blocks 41445 1727204186.39880: done extending task lists 41445 1727204186.39881: done processing included files 41445 1727204186.39882: results queue empty 41445 1727204186.39883: checking for any_errors_fatal 41445 1727204186.39885: done checking for 
any_errors_fatal 41445 1727204186.39886: checking for max_fail_percentage 41445 1727204186.39887: done checking for max_fail_percentage 41445 1727204186.39888: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.39888: done checking to see if all hosts have failed 41445 1727204186.39889: getting the remaining hosts for this loop 41445 1727204186.39891: done getting the remaining hosts for this loop 41445 1727204186.39893: getting the next task for host managed-node3 41445 1727204186.39896: done getting next task for host managed-node3 41445 1727204186.39898: ^ task is: TASK: Ensure state in ["present", "absent"] 41445 1727204186.39901: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.39904: getting variables 41445 1727204186.39905: in VariableManager get_vars() 41445 1727204186.39917: Calling all_inventory to load vars for managed-node3 41445 1727204186.39919: Calling groups_inventory to load vars for managed-node3 41445 1727204186.39921: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.39926: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.39929: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.39932: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.40082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.40275: done with get_vars() 41445 1727204186.40286: done getting variables 41445 1727204186.40347: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.043) 0:00:05.191 ***** 41445 1727204186.40380: entering _queue_task() for managed-node3/fail 41445 1727204186.40382: Creating lock for fail 41445 1727204186.40646: worker is 1 (out of 1 available) 41445 1727204186.40659: exiting _queue_task() for managed-node3/fail 41445 1727204186.40669: done queuing things up, now waiting for results queue to drain 41445 1727204186.40670: waiting for pending results... 
41445 1727204186.41094: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 41445 1727204186.41099: in run() - task 028d2410-947f-bf02-eee4-00000000026a 41445 1727204186.41104: variable 'ansible_search_path' from source: unknown 41445 1727204186.41107: variable 'ansible_search_path' from source: unknown 41445 1727204186.41110: calling self._execute() 41445 1727204186.41154: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.41166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.41183: variable 'omit' from source: magic vars 41445 1727204186.41526: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.41532: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.41616: variable 'state' from source: include params 41445 1727204186.41620: Evaluated conditional (state not in ["present", "absent"]): False 41445 1727204186.41622: when evaluation is False, skipping this task 41445 1727204186.41625: _execute() done 41445 1727204186.41628: dumping result to json 41445 1727204186.41638: done dumping result, returning 41445 1727204186.41644: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [028d2410-947f-bf02-eee4-00000000026a] 41445 1727204186.41647: sending task result for task 028d2410-947f-bf02-eee4-00000000026a 41445 1727204186.41722: done sending task result for task 028d2410-947f-bf02-eee4-00000000026a 41445 1727204186.41724: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 41445 1727204186.41804: no more pending results, returning what we have 41445 1727204186.41807: results queue empty 41445 1727204186.41808: checking for any_errors_fatal 41445 1727204186.41811: done checking for any_errors_fatal 41445 1727204186.41812: 
checking for max_fail_percentage 41445 1727204186.41813: done checking for max_fail_percentage 41445 1727204186.41814: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.41815: done checking to see if all hosts have failed 41445 1727204186.41815: getting the remaining hosts for this loop 41445 1727204186.41816: done getting the remaining hosts for this loop 41445 1727204186.41819: getting the next task for host managed-node3 41445 1727204186.41824: done getting next task for host managed-node3 41445 1727204186.41826: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 41445 1727204186.41829: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.41832: getting variables 41445 1727204186.41833: in VariableManager get_vars() 41445 1727204186.41872: Calling all_inventory to load vars for managed-node3 41445 1727204186.41876: Calling groups_inventory to load vars for managed-node3 41445 1727204186.41878: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.41887: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.41889: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.41891: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.42029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.42155: done with get_vars() 41445 1727204186.42162: done getting variables 41445 1727204186.42203: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.018) 0:00:05.210 ***** 41445 1727204186.42225: entering _queue_task() for managed-node3/fail 41445 1727204186.42400: worker is 1 (out of 1 available) 41445 1727204186.42415: exiting _queue_task() for managed-node3/fail 41445 1727204186.42426: done queuing things up, now waiting for results queue to drain 41445 1727204186.42428: waiting for pending results... 
41445 1727204186.42565: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 41445 1727204186.42629: in run() - task 028d2410-947f-bf02-eee4-00000000026b 41445 1727204186.42640: variable 'ansible_search_path' from source: unknown 41445 1727204186.42643: variable 'ansible_search_path' from source: unknown 41445 1727204186.42677: calling self._execute() 41445 1727204186.42739: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.42743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.42751: variable 'omit' from source: magic vars 41445 1727204186.43032: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.43035: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.43155: variable 'type' from source: set_fact 41445 1727204186.43158: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 41445 1727204186.43165: when evaluation is False, skipping this task 41445 1727204186.43168: _execute() done 41445 1727204186.43170: dumping result to json 41445 1727204186.43173: done dumping result, returning 41445 1727204186.43184: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-bf02-eee4-00000000026b] 41445 1727204186.43187: sending task result for task 028d2410-947f-bf02-eee4-00000000026b 41445 1727204186.43288: done sending task result for task 028d2410-947f-bf02-eee4-00000000026b 41445 1727204186.43291: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 41445 1727204186.43368: no more pending results, returning what we have 41445 1727204186.43375: results queue empty 41445 1727204186.43411: checking for any_errors_fatal 41445 1727204186.43416: done checking for any_errors_fatal 41445 1727204186.43417: 
checking for max_fail_percentage 41445 1727204186.43419: done checking for max_fail_percentage 41445 1727204186.43419: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.43420: done checking to see if all hosts have failed 41445 1727204186.43421: getting the remaining hosts for this loop 41445 1727204186.43426: done getting the remaining hosts for this loop 41445 1727204186.43430: getting the next task for host managed-node3 41445 1727204186.43434: done getting next task for host managed-node3 41445 1727204186.43436: ^ task is: TASK: Include the task 'show_interfaces.yml' 41445 1727204186.43438: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.43440: getting variables 41445 1727204186.43441: in VariableManager get_vars() 41445 1727204186.43465: Calling all_inventory to load vars for managed-node3 41445 1727204186.43469: Calling groups_inventory to load vars for managed-node3 41445 1727204186.43472: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.43482: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.43485: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.43488: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.43659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.43892: done with get_vars() 41445 1727204186.43903: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.017) 0:00:05.227 ***** 41445 1727204186.43992: entering _queue_task() for managed-node3/include_tasks 41445 1727204186.44183: worker is 1 (out of 1 available) 41445 1727204186.44197: exiting _queue_task() for managed-node3/include_tasks 41445 1727204186.44208: done queuing things up, now waiting for results queue to drain 41445 1727204186.44209: waiting for pending results... 
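The two guard tasks skipped above (task paths `manage_test_interface.yml:3` and `:8`) validate the include parameters before anything touches an interface. The trace prints their exact `false_condition` strings, so the guards presumably look like this (a sketch based on the conditionals shown; the `msg` text is an assumption):

```yaml
# Reconstructed from the false_condition output in the skip results above;
# the real tasks are in tests/network/playbooks/tasks/manage_test_interface.yml.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be present or absent"   # message wording is illustrative
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be dummy, tap or veth"   # message wording is illustrative
  when: type not in ["dummy", "tap", "veth"]
```

Because both `when` conditions evaluate False for this run, the tasks are skipped with `skip_reason: Conditional result was False`, exactly as the JSON skip results above show.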
41445 1727204186.44496: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 41445 1727204186.44667: in run() - task 028d2410-947f-bf02-eee4-00000000026c 41445 1727204186.44982: variable 'ansible_search_path' from source: unknown 41445 1727204186.44987: variable 'ansible_search_path' from source: unknown 41445 1727204186.44990: calling self._execute() 41445 1727204186.44992: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.44994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.44996: variable 'omit' from source: magic vars 41445 1727204186.45832: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.45913: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.46003: _execute() done 41445 1727204186.46014: dumping result to json 41445 1727204186.46021: done dumping result, returning 41445 1727204186.46031: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-bf02-eee4-00000000026c] 41445 1727204186.46040: sending task result for task 028d2410-947f-bf02-eee4-00000000026c 41445 1727204186.46208: no more pending results, returning what we have 41445 1727204186.46214: in VariableManager get_vars() 41445 1727204186.46261: Calling all_inventory to load vars for managed-node3 41445 1727204186.46264: Calling groups_inventory to load vars for managed-node3 41445 1727204186.46266: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.46283: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.46286: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.46289: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.46579: done sending task result for task 028d2410-947f-bf02-eee4-00000000026c 41445 1727204186.46583: WORKER PROCESS EXITING 41445 1727204186.46604: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.46823: done with get_vars() 41445 1727204186.46831: variable 'ansible_search_path' from source: unknown 41445 1727204186.46832: variable 'ansible_search_path' from source: unknown 41445 1727204186.46873: we have included files to process 41445 1727204186.46874: generating all_blocks data 41445 1727204186.46879: done generating all_blocks data 41445 1727204186.46884: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41445 1727204186.46885: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41445 1727204186.46887: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41445 1727204186.46994: in VariableManager get_vars() 41445 1727204186.47020: done with get_vars() 41445 1727204186.47137: done processing included file 41445 1727204186.47139: iterating over new_blocks loaded from include file 41445 1727204186.47140: in VariableManager get_vars() 41445 1727204186.47157: done with get_vars() 41445 1727204186.47158: filtering new block on tags 41445 1727204186.47185: done filtering new block on tags 41445 1727204186.47188: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 41445 1727204186.47192: extending task lists for all hosts with included blocks 41445 1727204186.47640: done extending task lists 41445 1727204186.47642: done processing included files 41445 1727204186.47643: results queue empty 41445 1727204186.47643: checking for any_errors_fatal 41445 1727204186.47647: done checking for any_errors_fatal 41445 1727204186.47648: checking for 
max_fail_percentage 41445 1727204186.47649: done checking for max_fail_percentage 41445 1727204186.47650: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.47650: done checking to see if all hosts have failed 41445 1727204186.47652: getting the remaining hosts for this loop 41445 1727204186.47653: done getting the remaining hosts for this loop 41445 1727204186.47656: getting the next task for host managed-node3 41445 1727204186.47660: done getting next task for host managed-node3 41445 1727204186.47662: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41445 1727204186.47665: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.47668: getting variables 41445 1727204186.47669: in VariableManager get_vars() 41445 1727204186.47713: Calling all_inventory to load vars for managed-node3 41445 1727204186.47716: Calling groups_inventory to load vars for managed-node3 41445 1727204186.47718: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.47732: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.47735: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.47738: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.47893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.48122: done with get_vars() 41445 1727204186.48131: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.042) 0:00:05.269 ***** 41445 1727204186.48215: entering _queue_task() for managed-node3/include_tasks 41445 1727204186.48639: worker is 1 (out of 1 available) 41445 1727204186.48651: exiting _queue_task() for managed-node3/include_tasks 41445 1727204186.48664: done queuing things up, now waiting for results queue to drain 41445 1727204186.48665: waiting for pending results... 
41445 1727204186.48950: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 41445 1727204186.49085: in run() - task 028d2410-947f-bf02-eee4-000000000369 41445 1727204186.49105: variable 'ansible_search_path' from source: unknown 41445 1727204186.49116: variable 'ansible_search_path' from source: unknown 41445 1727204186.49151: calling self._execute() 41445 1727204186.49274: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.49280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.49300: variable 'omit' from source: magic vars 41445 1727204186.49724: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.49737: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.49740: _execute() done 41445 1727204186.49743: dumping result to json 41445 1727204186.49746: done dumping result, returning 41445 1727204186.49749: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-bf02-eee4-000000000369] 41445 1727204186.49755: sending task result for task 028d2410-947f-bf02-eee4-000000000369 41445 1727204186.49848: done sending task result for task 028d2410-947f-bf02-eee4-000000000369 41445 1727204186.49850: WORKER PROCESS EXITING 41445 1727204186.49898: no more pending results, returning what we have 41445 1727204186.49904: in VariableManager get_vars() 41445 1727204186.49964: Calling all_inventory to load vars for managed-node3 41445 1727204186.49966: Calling groups_inventory to load vars for managed-node3 41445 1727204186.49969: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.49980: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.49982: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.49985: Calling groups_plugins_play to load vars for managed-node3 41445 
1727204186.50118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.50272: done with get_vars() 41445 1727204186.50283: variable 'ansible_search_path' from source: unknown 41445 1727204186.50284: variable 'ansible_search_path' from source: unknown 41445 1727204186.50340: we have included files to process 41445 1727204186.50341: generating all_blocks data 41445 1727204186.50346: done generating all_blocks data 41445 1727204186.50347: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41445 1727204186.50348: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41445 1727204186.50351: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41445 1727204186.50623: done processing included file 41445 1727204186.50626: iterating over new_blocks loaded from include file 41445 1727204186.50627: in VariableManager get_vars() 41445 1727204186.50645: done with get_vars() 41445 1727204186.50646: filtering new block on tags 41445 1727204186.50661: done filtering new block on tags 41445 1727204186.50664: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 41445 1727204186.50668: extending task lists for all hosts with included blocks 41445 1727204186.50812: done extending task lists 41445 1727204186.50814: done processing included files 41445 1727204186.50815: results queue empty 41445 1727204186.50816: checking for any_errors_fatal 41445 1727204186.50818: done checking for any_errors_fatal 41445 1727204186.50819: checking for max_fail_percentage 41445 1727204186.50820: done 
checking for max_fail_percentage 41445 1727204186.50821: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.50821: done checking to see if all hosts have failed 41445 1727204186.50822: getting the remaining hosts for this loop 41445 1727204186.50823: done getting the remaining hosts for this loop 41445 1727204186.50826: getting the next task for host managed-node3 41445 1727204186.50830: done getting next task for host managed-node3 41445 1727204186.50832: ^ task is: TASK: Gather current interface info 41445 1727204186.50835: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.50838: getting variables 41445 1727204186.50839: in VariableManager get_vars() 41445 1727204186.50852: Calling all_inventory to load vars for managed-node3 41445 1727204186.50854: Calling groups_inventory to load vars for managed-node3 41445 1727204186.50856: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.50861: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.50864: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.50867: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.51017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.51218: done with get_vars() 41445 1727204186.51227: done getting variables 41445 1727204186.51263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.030) 0:00:05.300 ***** 41445 1727204186.51293: entering _queue_task() for managed-node3/command 41445 1727204186.51559: worker is 1 (out of 1 available) 41445 1727204186.51572: exiting _queue_task() for managed-node3/command 41445 1727204186.51586: done queuing things up, now waiting for results queue to drain 41445 1727204186.51588: waiting for pending results... 
41445 1727204186.51758: running TaskExecutor() for managed-node3/TASK: Gather current interface info 41445 1727204186.51828: in run() - task 028d2410-947f-bf02-eee4-0000000003a0 41445 1727204186.51839: variable 'ansible_search_path' from source: unknown 41445 1727204186.51843: variable 'ansible_search_path' from source: unknown 41445 1727204186.51871: calling self._execute() 41445 1727204186.51945: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.51948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.51957: variable 'omit' from source: magic vars 41445 1727204186.52267: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.52279: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.52285: variable 'omit' from source: magic vars 41445 1727204186.52324: variable 'omit' from source: magic vars 41445 1727204186.52348: variable 'omit' from source: magic vars 41445 1727204186.52379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204186.52408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204186.52428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204186.52438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.52447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.52541: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204186.52545: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.52548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 
1727204186.52596: Set connection var ansible_shell_executable to /bin/sh 41445 1727204186.52600: Set connection var ansible_shell_type to sh 41445 1727204186.52602: Set connection var ansible_pipelining to False 41445 1727204186.52605: Set connection var ansible_timeout to 10 41445 1727204186.52607: Set connection var ansible_connection to ssh 41445 1727204186.52612: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204186.52764: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.52767: variable 'ansible_connection' from source: unknown 41445 1727204186.52770: variable 'ansible_module_compression' from source: unknown 41445 1727204186.52773: variable 'ansible_shell_type' from source: unknown 41445 1727204186.52777: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.52781: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.52783: variable 'ansible_pipelining' from source: unknown 41445 1727204186.52785: variable 'ansible_timeout' from source: unknown 41445 1727204186.52787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.52808: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204186.52816: variable 'omit' from source: magic vars 41445 1727204186.52822: starting attempt loop 41445 1727204186.52825: running the handler 41445 1727204186.52841: _low_level_execute_command(): starting 41445 1727204186.52912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204186.53653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41445 1727204186.53671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204186.53680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.53762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204186.53783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.53798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204186.53800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204186.53883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.53934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.55467: stdout chunk (state=3): >>>/root <<< 41445 1727204186.55557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.55683: stderr chunk (state=3): >>><<< 41445 1727204186.55686: stdout chunk (state=3): >>><<< 41445 1727204186.55689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204186.55707: _low_level_execute_command(): starting 41445 1727204186.55723: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741 `" && echo ansible-tmp-1727204186.5569408-42039-65350392782741="` echo /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741 `" ) && sleep 0' 41445 1727204186.56893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204186.57041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.57194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.58915: stdout chunk (state=3): >>>ansible-tmp-1727204186.5569408-42039-65350392782741=/root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741 <<< 41445 1727204186.59026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.59060: stderr chunk (state=3): >>><<< 41445 1727204186.59068: stdout chunk (state=3): >>><<< 41445 1727204186.59116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204186.5569408-42039-65350392782741=/root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
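The remote tmpdir name echoed back above appears to follow an `ansible-tmp-<epoch>-<pid>-<random>` convention. A sketch that unpacks the exact name seen in this run (the field meanings are an assumption inferred from the observed values, not confirmed by the log):

```python
# Directory name copied verbatim from the log above.
name = "ansible-tmp-1727204186.5569408-42039-65350392782741"

# Assumed layout: ansible-tmp-<epoch seconds>-<worker pid>-<random suffix>
_, _, epoch, pid, suffix = name.split("-")
print(epoch, pid, suffix)
```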
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204186.59481: variable 'ansible_module_compression' from source: unknown 41445 1727204186.59486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204186.59488: variable 'ansible_facts' from source: unknown 41445 1727204186.59650: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py 41445 1727204186.59841: Sending initial data 41445 1727204186.59942: Sent initial data (155 bytes) 41445 1727204186.60982: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204186.60999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204186.61013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.61058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204186.61268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.61288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.62799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204186.62827: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204186.62859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpnj6dj3nz /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py <<< 41445 1727204186.63019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py" debug1: stat remote: No such file or directory <<< 41445 1727204186.63023: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpnj6dj3nz" to remote "/root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py" <<< 41445 1727204186.64300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.64429: stderr chunk (state=3): >>><<< 41445 1727204186.64433: stdout chunk (state=3): >>><<< 41445 1727204186.64492: done transferring module to remote 41445 1727204186.64502: _low_level_execute_command(): starting 41445 1727204186.64508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/ /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py && sleep 0' 41445 1727204186.65697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
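After the sftp `put` above, the transferred module is made executable and then run with the remote interpreter. The shape of that transfer / chmod / execute sequence can be reproduced locally (paths and payload here are stand-ins, not the actual AnsiballZ wrapper):

```python
import os
import stat
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmpdir:
    # Stand-in for the uploaded AnsiballZ_command.py payload.
    module_path = os.path.join(tmpdir, "AnsiballZ_demo.py")
    with open(module_path, "w") as f:
        f.write('print("demo module ran")\n')

    # Equivalent of: chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py
    os.chmod(module_path, os.stat(module_path).st_mode | stat.S_IXUSR)

    # Equivalent of: /usr/bin/python3.12 <tmpdir>/AnsiballZ_command.py
    out = subprocess.run([sys.executable, module_path],
                         capture_output=True, text=True).stdout

print(out.strip())
```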
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.65869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204186.65873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204186.65907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.65913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.67682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.67686: stdout chunk (state=3): >>><<< 41445 1727204186.67689: stderr chunk (state=3): >>><<< 41445 1727204186.67791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204186.67799: _low_level_execute_command(): starting 41445 1727204186.67803: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/AnsiballZ_command.py && sleep 0' 41445 1727204186.69347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204186.69352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204186.69354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204186.69356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204186.69358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204186.69360: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204186.69362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.69364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 
1727204186.69366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.69441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.84574: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:26.841105", "end": "2024-09-24 14:56:26.844164", "delta": "0:00:00.003059", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204186.86099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204186.86102: stdout chunk (state=3): >>><<< 41445 1727204186.86104: stderr chunk (state=3): >>><<< 41445 1727204186.86217: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:26.841105", "end": "2024-09-24 14:56:26.844164", "delta": "0:00:00.003059", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
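The module reports back as a single JSON object on stdout, which the controller parses into the task result. A sketch of that parse step using the payload captured above (shortened to the fields used here):

```python
import json

# Raw stdout chunk copied from the log above.
raw = ('{"changed": true, "stdout": "bonding_masters\\neth0\\nlo\\nrpltstbr", '
       '"stderr": "", "rc": 0, "cmd": ["ls", "-1"], '
       '"delta": "0:00:00.003059"}')

result = json.loads(raw)
print(result["rc"], result["cmd"], result["stdout"].splitlines())
```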
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204186.86220: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204186.86222: _low_level_execute_command(): starting 41445 1727204186.86224: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204186.5569408-42039-65350392782741/ > /dev/null 2>&1 && sleep 0' 41445 1727204186.87565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204186.87686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204186.87739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204186.87852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204186.89692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204186.89742: stderr chunk (state=3): >>><<< 41445 1727204186.89752: stdout chunk (state=3): >>><<< 41445 1727204186.89782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204186.89813: handler run complete 41445 1727204186.89901: Evaluated conditional (False): False 41445 1727204186.89905: attempt loop complete, returning result 41445 1727204186.89907: _execute() done 41445 1727204186.89909: dumping result to json 41445 1727204186.89911: done dumping result, returning 41445 1727204186.89913: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [028d2410-947f-bf02-eee4-0000000003a0] 41445 1727204186.89915: sending task result for task 028d2410-947f-bf02-eee4-0000000003a0 ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003059", "end": "2024-09-24 14:56:26.844164", "rc": 0, "start": "2024-09-24 14:56:26.841105" } STDOUT: bonding_masters eth0 lo rpltstbr 41445 1727204186.90126: no more pending results, returning what we have 41445 1727204186.90130: results queue empty 41445 1727204186.90130: checking for any_errors_fatal 41445 1727204186.90132: done checking for any_errors_fatal 41445 1727204186.90133: checking for max_fail_percentage 41445 1727204186.90134: done checking for max_fail_percentage 41445 1727204186.90135: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.90136: done checking to see if all hosts have failed 
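The `start`, `end`, and `delta` fields in the task result above are plain timestamps; the reported delta can be re-derived from them:

```python
from datetime import datetime

# Timestamps copied from the task result above.
fmt = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.strptime("2024-09-24 14:56:26.841105", fmt)
end = datetime.strptime("2024-09-24 14:56:26.844164", fmt)

# Matches the reported "delta": "0:00:00.003059"
print(str(end - start))
```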
41445 1727204186.90137: getting the remaining hosts for this loop 41445 1727204186.90138: done getting the remaining hosts for this loop 41445 1727204186.90142: getting the next task for host managed-node3 41445 1727204186.90150: done getting next task for host managed-node3 41445 1727204186.90267: ^ task is: TASK: Set current_interfaces 41445 1727204186.90273: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.90280: getting variables 41445 1727204186.90282: in VariableManager get_vars() 41445 1727204186.90484: Calling all_inventory to load vars for managed-node3 41445 1727204186.90490: Calling groups_inventory to load vars for managed-node3 41445 1727204186.90493: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.90499: done sending task result for task 028d2410-947f-bf02-eee4-0000000003a0 41445 1727204186.90503: WORKER PROCESS EXITING 41445 1727204186.90514: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.90517: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.90521: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.90829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.91053: done with get_vars() 41445 1727204186.91064: done getting variables 41445 1727204186.91133: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.398) 0:00:05.699 ***** 41445 1727204186.91166: entering _queue_task() for managed-node3/set_fact 41445 1727204186.91641: worker is 1 (out of 1 available) 41445 1727204186.91653: exiting _queue_task() for managed-node3/set_fact 41445 1727204186.91663: done queuing things up, now waiting for results queue to drain 41445 1727204186.91664: waiting for pending results... 
41445 1727204186.91979: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 41445 1727204186.92408: in run() - task 028d2410-947f-bf02-eee4-0000000003a1 41445 1727204186.92412: variable 'ansible_search_path' from source: unknown 41445 1727204186.92415: variable 'ansible_search_path' from source: unknown 41445 1727204186.92419: calling self._execute() 41445 1727204186.92531: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.92682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.92685: variable 'omit' from source: magic vars 41445 1727204186.93152: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.93173: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.93189: variable 'omit' from source: magic vars 41445 1727204186.93251: variable 'omit' from source: magic vars 41445 1727204186.93383: variable '_current_interfaces' from source: set_fact 41445 1727204186.93460: variable 'omit' from source: magic vars 41445 1727204186.93603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204186.93666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204186.93795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204186.93823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.93840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.94061: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204186.94065: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.94067: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.94138: Set connection var ansible_shell_executable to /bin/sh 41445 1727204186.94147: Set connection var ansible_shell_type to sh 41445 1727204186.94158: Set connection var ansible_pipelining to False 41445 1727204186.94171: Set connection var ansible_timeout to 10 41445 1727204186.94182: Set connection var ansible_connection to ssh 41445 1727204186.94196: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204186.94234: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.94280: variable 'ansible_connection' from source: unknown 41445 1727204186.94284: variable 'ansible_module_compression' from source: unknown 41445 1727204186.94286: variable 'ansible_shell_type' from source: unknown 41445 1727204186.94288: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.94290: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.94292: variable 'ansible_pipelining' from source: unknown 41445 1727204186.94294: variable 'ansible_timeout' from source: unknown 41445 1727204186.94296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.94446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204186.94481: variable 'omit' from source: magic vars 41445 1727204186.94484: starting attempt loop 41445 1727204186.94487: running the handler 41445 1727204186.94568: handler run complete 41445 1727204186.94572: attempt loop complete, returning result 41445 1727204186.94574: _execute() done 41445 1727204186.94580: dumping result to json 41445 1727204186.94582: done dumping result, returning 41445 
1727204186.94584: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [028d2410-947f-bf02-eee4-0000000003a1] 41445 1727204186.94586: sending task result for task 028d2410-947f-bf02-eee4-0000000003a1 ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 41445 1727204186.94739: no more pending results, returning what we have 41445 1727204186.94743: results queue empty 41445 1727204186.94744: checking for any_errors_fatal 41445 1727204186.94753: done checking for any_errors_fatal 41445 1727204186.94753: checking for max_fail_percentage 41445 1727204186.94755: done checking for max_fail_percentage 41445 1727204186.94756: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.94757: done checking to see if all hosts have failed 41445 1727204186.94758: getting the remaining hosts for this loop 41445 1727204186.94759: done getting the remaining hosts for this loop 41445 1727204186.94763: getting the next task for host managed-node3 41445 1727204186.94773: done getting next task for host managed-node3 41445 1727204186.94777: ^ task is: TASK: Show current_interfaces 41445 1727204186.94782: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
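The `set_fact` result above turns the registered command output into the `current_interfaces` fact without touching the remote host again. A hedged reconstruction of that step in plain Python (the shape of `_current_interfaces` is an assumption; the log shows only its source and the final fact value):

```python
# Assumed shape of the earlier registered value referenced in the log as
# "variable '_current_interfaces' from source: set_fact".
_current_interfaces = "bonding_masters\neth0\nlo\nrpltstbr".splitlines()

# Equivalent of: set_fact: current_interfaces: "{{ _current_interfaces }}"
ansible_facts = {"current_interfaces": list(_current_interfaces)}
print(ansible_facts)
```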
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204186.94786: getting variables 41445 1727204186.94788: in VariableManager get_vars() 41445 1727204186.94835: Calling all_inventory to load vars for managed-node3 41445 1727204186.94839: Calling groups_inventory to load vars for managed-node3 41445 1727204186.94841: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.94984: done sending task result for task 028d2410-947f-bf02-eee4-0000000003a1 41445 1727204186.94988: WORKER PROCESS EXITING 41445 1727204186.95000: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.95003: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.95007: Calling groups_plugins_play to load vars for managed-node3 41445 1727204186.95457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204186.96025: done with get_vars() 41445 1727204186.96034: done getting variables 41445 1727204186.96089: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.049) 0:00:05.748 ***** 41445 1727204186.96120: entering _queue_task() for managed-node3/debug 41445 1727204186.96373: worker is 1 (out of 1 available) 41445 1727204186.96390: exiting _queue_task() for managed-node3/debug 41445 1727204186.96403: done queuing things up, now waiting for results queue to drain 41445 1727204186.96404: waiting for pending 
results... 41445 1727204186.97086: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 41445 1727204186.97361: in run() - task 028d2410-947f-bf02-eee4-00000000036a 41445 1727204186.97402: variable 'ansible_search_path' from source: unknown 41445 1727204186.97610: variable 'ansible_search_path' from source: unknown 41445 1727204186.97615: calling self._execute() 41445 1727204186.97700: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.97715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.97735: variable 'omit' from source: magic vars 41445 1727204186.98681: variable 'ansible_distribution_major_version' from source: facts 41445 1727204186.98685: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204186.98687: variable 'omit' from source: magic vars 41445 1727204186.98690: variable 'omit' from source: magic vars 41445 1727204186.98837: variable 'current_interfaces' from source: set_fact 41445 1727204186.98868: variable 'omit' from source: magic vars 41445 1727204186.98954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204186.98997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204186.99028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204186.99051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.99087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204186.99122: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204186.99131: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.99143: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.99253: Set connection var ansible_shell_executable to /bin/sh 41445 1727204186.99263: Set connection var ansible_shell_type to sh 41445 1727204186.99273: Set connection var ansible_pipelining to False 41445 1727204186.99287: Set connection var ansible_timeout to 10 41445 1727204186.99358: Set connection var ansible_connection to ssh 41445 1727204186.99361: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204186.99364: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.99368: variable 'ansible_connection' from source: unknown 41445 1727204186.99371: variable 'ansible_module_compression' from source: unknown 41445 1727204186.99374: variable 'ansible_shell_type' from source: unknown 41445 1727204186.99379: variable 'ansible_shell_executable' from source: unknown 41445 1727204186.99382: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204186.99384: variable 'ansible_pipelining' from source: unknown 41445 1727204186.99387: variable 'ansible_timeout' from source: unknown 41445 1727204186.99390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204186.99534: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204186.99551: variable 'omit' from source: magic vars 41445 1727204186.99560: starting attempt loop 41445 1727204186.99567: running the handler 41445 1727204186.99622: handler run complete 41445 1727204186.99683: attempt loop complete, returning result 41445 1727204186.99688: _execute() done 41445 1727204186.99690: dumping result to json 41445 1727204186.99692: done dumping result, returning 41445 1727204186.99695: done 
running TaskExecutor() for managed-node3/TASK: Show current_interfaces [028d2410-947f-bf02-eee4-00000000036a] 41445 1727204186.99697: sending task result for task 028d2410-947f-bf02-eee4-00000000036a ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 41445 1727204186.99828: no more pending results, returning what we have 41445 1727204186.99831: results queue empty 41445 1727204186.99832: checking for any_errors_fatal 41445 1727204186.99839: done checking for any_errors_fatal 41445 1727204186.99840: checking for max_fail_percentage 41445 1727204186.99841: done checking for max_fail_percentage 41445 1727204186.99842: checking to see if all hosts have failed and the running result is not ok 41445 1727204186.99843: done checking to see if all hosts have failed 41445 1727204186.99843: getting the remaining hosts for this loop 41445 1727204186.99844: done getting the remaining hosts for this loop 41445 1727204186.99848: getting the next task for host managed-node3 41445 1727204186.99856: done getting next task for host managed-node3 41445 1727204186.99858: ^ task is: TASK: Install iproute 41445 1727204186.99861: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204186.99864: getting variables 41445 1727204186.99866: in VariableManager get_vars() 41445 1727204186.99912: Calling all_inventory to load vars for managed-node3 41445 1727204186.99915: Calling groups_inventory to load vars for managed-node3 41445 1727204186.99918: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204186.99929: Calling all_plugins_play to load vars for managed-node3 41445 1727204186.99932: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204186.99935: Calling groups_plugins_play to load vars for managed-node3 41445 1727204187.00166: done sending task result for task 028d2410-947f-bf02-eee4-00000000036a 41445 1727204187.00169: WORKER PROCESS EXITING 41445 1727204187.00191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204187.00522: done with get_vars() 41445 1727204187.00715: done getting variables 41445 1727204187.00981: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.048) 0:00:05.797 ***** 41445 1727204187.01014: entering _queue_task() for managed-node3/package 41445 1727204187.01309: worker is 1 (out of 1 available) 41445 1727204187.01320: exiting _queue_task() for managed-node3/package 41445 1727204187.01332: done queuing things up, now waiting for results queue to drain 41445 1727204187.01334: waiting for pending results... 
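For orientation, the two tasks traced above (Set current_interfaces / Show current_interfaces) can be reconstructed in outline from the log. This is a hedged sketch, not the actual contents of show_interfaces.yml: the task names, the `current_interfaces` fact value, the debug message format, and the `ansible_distribution_major_version != '6'` conditional are all confirmed by the trace, but the expression that builds the list is an assumption.

```yaml
# Hypothetical reconstruction of the traced tasks; only the task names,
# the resulting current_interfaces fact, the debug message, and the
# version conditional are confirmed by the log above.
- name: Set current_interfaces
  set_fact:
    # assumption: the real playbook derives this list elsewhere (the log
    # only shows the result: [bonding_masters, eth0, lo, rpltstbr])
    current_interfaces: "{{ ansible_interfaces | sort }}"
  when: ansible_distribution_major_version != '6'

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
  when: ansible_distribution_major_version != '6'
```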
41445 1727204187.01697: running TaskExecutor() for managed-node3/TASK: Install iproute 41445 1727204187.01733: in run() - task 028d2410-947f-bf02-eee4-00000000026d 41445 1727204187.01752: variable 'ansible_search_path' from source: unknown 41445 1727204187.01759: variable 'ansible_search_path' from source: unknown 41445 1727204187.01802: calling self._execute() 41445 1727204187.01904: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204187.01917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204187.01933: variable 'omit' from source: magic vars 41445 1727204187.02310: variable 'ansible_distribution_major_version' from source: facts 41445 1727204187.02337: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204187.02352: variable 'omit' from source: magic vars 41445 1727204187.02393: variable 'omit' from source: magic vars 41445 1727204187.02593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204187.04889: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204187.04963: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204187.05005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204187.05048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204187.05166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204187.05191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204187.05243: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204187.05383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204187.05388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204187.05391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204187.05543: variable '__network_is_ostree' from source: set_fact 41445 1727204187.05580: variable 'omit' from source: magic vars 41445 1727204187.05730: variable 'omit' from source: magic vars 41445 1727204187.05769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204187.05804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204187.05829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204187.05855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204187.05872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204187.05910: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204187.05920: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204187.05945: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204187.06035: Set connection var ansible_shell_executable to /bin/sh 41445 1727204187.06044: Set connection var ansible_shell_type to sh 41445 1727204187.06082: Set connection var ansible_pipelining to False 41445 1727204187.06086: Set connection var ansible_timeout to 10 41445 1727204187.06089: Set connection var ansible_connection to ssh 41445 1727204187.06091: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204187.06120: variable 'ansible_shell_executable' from source: unknown 41445 1727204187.06128: variable 'ansible_connection' from source: unknown 41445 1727204187.06162: variable 'ansible_module_compression' from source: unknown 41445 1727204187.06166: variable 'ansible_shell_type' from source: unknown 41445 1727204187.06168: variable 'ansible_shell_executable' from source: unknown 41445 1727204187.06170: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204187.06172: variable 'ansible_pipelining' from source: unknown 41445 1727204187.06174: variable 'ansible_timeout' from source: unknown 41445 1727204187.06178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204187.06280: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204187.06290: variable 'omit' from source: magic vars 41445 1727204187.06382: starting attempt loop 41445 1727204187.06385: running the handler 41445 1727204187.06387: variable 'ansible_facts' from source: unknown 41445 1727204187.06389: variable 'ansible_facts' from source: unknown 41445 1727204187.06392: _low_level_execute_command(): starting 41445 1727204187.06394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 
1727204187.07082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.07143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.07170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.07196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.07291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.08959: stdout chunk (state=3): >>>/root <<< 41445 1727204187.09125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.09130: stdout chunk (state=3): >>><<< 41445 1727204187.09132: stderr chunk (state=3): >>><<< 41445 1727204187.09153: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.09265: _low_level_execute_command(): starting 41445 1727204187.09269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759 `" && echo ansible-tmp-1727204187.0916655-42121-131936977671759="` echo /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759 `" ) && sleep 0' 41445 1727204187.10045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.10091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.10167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.12003: stdout chunk (state=3): >>>ansible-tmp-1727204187.0916655-42121-131936977671759=/root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759 <<< 41445 1727204187.12160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.12164: stdout chunk (state=3): >>><<< 41445 1727204187.12167: stderr chunk (state=3): >>><<< 41445 1727204187.12214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204187.0916655-42121-131936977671759=/root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.12240: variable 'ansible_module_compression' from source: unknown 41445 1727204187.12284: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 41445 1727204187.12296: ANSIBALLZ: Acquiring lock 41445 1727204187.12299: ANSIBALLZ: Lock acquired: 140182283768784 41445 1727204187.12302: ANSIBALLZ: Creating module 41445 1727204187.23613: ANSIBALLZ: Writing module into payload 41445 1727204187.23744: ANSIBALLZ: Writing module 41445 1727204187.23763: ANSIBALLZ: Renaming module 41445 1727204187.23774: ANSIBALLZ: Done creating module 41445 1727204187.23794: variable 'ansible_facts' from source: unknown 41445 1727204187.23849: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py 41445 1727204187.23951: Sending initial data 41445 1727204187.23954: Sent initial data (152 bytes) 41445 1727204187.24395: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.24399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.24415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.24470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.24473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.24479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.24521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.26124: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204187.26130: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204187.26165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204187.26195: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpzm_a6f85 /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py <<< 41445 1727204187.26202: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py" <<< 41445 1727204187.26232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpzm_a6f85" to remote "/root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py" <<< 41445 1727204187.26234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py" <<< 41445 1727204187.26858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.26903: stderr chunk (state=3): >>><<< 41445 1727204187.26906: stdout chunk (state=3): >>><<< 41445 1727204187.26938: done transferring module to remote 41445 1727204187.26947: _low_level_execute_command(): starting 41445 1727204187.26952: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/ /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py && sleep 0' 41445 1727204187.27373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.27379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204187.27409: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.27412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204187.27414: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.27416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.27470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.27473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.27493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.27517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.29227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.29252: stderr chunk (state=3): >>><<< 41445 1727204187.29255: stdout chunk (state=3): >>><<< 41445 1727204187.29268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.29271: _low_level_execute_command(): starting 41445 1727204187.29278: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/AnsiballZ_dnf.py && sleep 0' 41445 1727204187.29708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.29712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204187.29714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.29716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.29718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.29773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.29782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.29784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.29815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.70162: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 41445 1727204187.75981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204187.75987: stdout chunk (state=3): >>><<< 41445 1727204187.75990: stderr chunk (state=3): >>><<< 41445 1727204187.75992: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204187.76001: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204187.76024: _low_level_execute_command(): starting 41445 1727204187.76035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.0916655-42121-131936977671759/ > /dev/null 2>&1 && sleep 0' 41445 1727204187.76799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.76859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.76895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.76936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.76977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.79282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.79287: stdout chunk (state=3): >>><<< 41445 1727204187.79289: stderr chunk (state=3): >>><<< 41445 1727204187.79292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.79294: handler run complete 41445 1727204187.79296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204187.79651: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204187.80082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204187.80086: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204187.80088: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204187.80090: variable '__install_status' from source: unknown 41445 1727204187.80092: Evaluated conditional (__install_status is success): True 41445 1727204187.80107: attempt loop complete, returning result 41445 1727204187.80114: _execute() done 41445 1727204187.80129: dumping result to json 41445 1727204187.80582: done dumping result, returning 41445 1727204187.80585: done running TaskExecutor() for managed-node3/TASK: Install iproute [028d2410-947f-bf02-eee4-00000000026d] 41445 1727204187.80589: sending task result for task 028d2410-947f-bf02-eee4-00000000026d 41445 1727204187.80664: done sending task result for task 028d2410-947f-bf02-eee4-00000000026d 41445 1727204187.80668: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 41445 1727204187.80810: no more pending results, returning what we have 41445 1727204187.80814: results queue empty 41445 1727204187.80815: checking for any_errors_fatal 41445 1727204187.80819: done checking for any_errors_fatal 41445 1727204187.80820: checking for max_fail_percentage 41445 1727204187.80822: done checking for max_fail_percentage 41445 1727204187.80822: checking 
to see if all hosts have failed and the running result is not ok 41445 1727204187.80823: done checking to see if all hosts have failed 41445 1727204187.80824: getting the remaining hosts for this loop 41445 1727204187.80825: done getting the remaining hosts for this loop 41445 1727204187.80828: getting the next task for host managed-node3 41445 1727204187.80833: done getting next task for host managed-node3 41445 1727204187.80835: ^ task is: TASK: Create veth interface {{ interface }} 41445 1727204187.80838: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204187.80841: getting variables 41445 1727204187.80842: in VariableManager get_vars() 41445 1727204187.81085: Calling all_inventory to load vars for managed-node3 41445 1727204187.81092: Calling groups_inventory to load vars for managed-node3 41445 1727204187.81095: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204187.81105: Calling all_plugins_play to load vars for managed-node3 41445 1727204187.81107: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204187.81112: Calling groups_plugins_play to load vars for managed-node3 41445 1727204187.81414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204187.81891: done with get_vars() 41445 1727204187.81905: done getting variables 41445 1727204187.81970: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204187.82263: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.812) 0:00:06.610 ***** 41445 1727204187.82302: entering _queue_task() for managed-node3/command 41445 1727204187.82630: worker is 1 (out of 1 available) 41445 1727204187.82762: exiting _queue_task() for managed-node3/command 41445 1727204187.82779: done queuing things up, now waiting for results queue to drain 41445 1727204187.82780: waiting for pending results... 
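The stdout chunks above carry each module's result as one JSON object embedded between the `>>>`/`<<<` chunk markers (here, the dnf result `{"msg": "Nothing to do", "changed": false, ...}`). A minimal post-processing sketch that recovers that object by brace matching — the helper name `extract_module_result` and the trimmed sample chunk are illustrative, not part of Ansible:

```python
import json

def extract_module_result(log_chunk: str) -> dict:
    """Pull the first complete JSON object (the module result) out of a
    verbose ansible-playbook stdout chunk.  Brace counting skips the
    '41445 <ts>: stdout chunk ...' prefix and the >>> / <<< markers,
    and tolerates nested objects like "invocation"/"module_args"."""
    start = log_chunk.index("{")
    depth = 0
    for i, ch in enumerate(log_chunk[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(log_chunk[start:i + 1])
    raise ValueError("no complete JSON object in chunk")

# A shortened version of the dnf stdout chunk logged above.
chunk = ('41445 1727204187.70162: stdout chunk (state=3): >>> '
         '{"msg": "Nothing to do", "changed": false, "rc": 0, '
         '"results": []} <<<')
result = extract_module_result(chunk)
print(result["rc"], result["changed"])  # 0 False
```

This mirrors what the task summary then prints (`ok: [managed-node3] => {... "rc": 0 ...} MSG: Nothing to do`).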
41445 1727204187.82953: running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 41445 1727204187.83057: in run() - task 028d2410-947f-bf02-eee4-00000000026e 41445 1727204187.83070: variable 'ansible_search_path' from source: unknown 41445 1727204187.83076: variable 'ansible_search_path' from source: unknown 41445 1727204187.83362: variable 'interface' from source: set_fact 41445 1727204187.83454: variable 'interface' from source: set_fact 41445 1727204187.83533: variable 'interface' from source: set_fact 41445 1727204187.83700: Loaded config def from plugin (lookup/items) 41445 1727204187.83711: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 41445 1727204187.83730: variable 'omit' from source: magic vars 41445 1727204187.83868: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204187.83881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204187.83898: variable 'omit' from source: magic vars 41445 1727204187.84146: variable 'ansible_distribution_major_version' from source: facts 41445 1727204187.84154: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204187.84366: variable 'type' from source: set_fact 41445 1727204187.84370: variable 'state' from source: include params 41445 1727204187.84374: variable 'interface' from source: set_fact 41445 1727204187.84384: variable 'current_interfaces' from source: set_fact 41445 1727204187.84396: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41445 1727204187.84402: variable 'omit' from source: magic vars 41445 1727204187.84436: variable 'omit' from source: magic vars 41445 1727204187.84482: variable 'item' from source: unknown 41445 1727204187.84554: variable 'item' from source: unknown 41445 1727204187.84573: variable 'omit' from source: magic vars 41445 1727204187.84613: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204187.84641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204187.84659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204187.84681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204187.84692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204187.84725: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204187.84729: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204187.84731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204187.84836: Set connection var ansible_shell_executable to /bin/sh 41445 1727204187.84839: Set connection var ansible_shell_type to sh 41445 1727204187.84844: Set connection var ansible_pipelining to False 41445 1727204187.84852: Set connection var ansible_timeout to 10 41445 1727204187.84855: Set connection var ansible_connection to ssh 41445 1727204187.84955: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204187.84958: variable 'ansible_shell_executable' from source: unknown 41445 1727204187.84960: variable 'ansible_connection' from source: unknown 41445 1727204187.84962: variable 'ansible_module_compression' from source: unknown 41445 1727204187.84964: variable 'ansible_shell_type' from source: unknown 41445 1727204187.84967: variable 'ansible_shell_executable' from source: unknown 41445 1727204187.84968: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204187.84970: variable 'ansible_pipelining' from source: unknown 41445 1727204187.84972: variable 'ansible_timeout' from 
source: unknown 41445 1727204187.84974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204187.85052: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204187.85061: variable 'omit' from source: magic vars 41445 1727204187.85064: starting attempt loop 41445 1727204187.85066: running the handler 41445 1727204187.85084: _low_level_execute_command(): starting 41445 1727204187.85092: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204187.86004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.86029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.86047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.86051: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41445 1727204187.86112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.87670: stdout chunk (state=3): >>>/root <<< 41445 1727204187.87802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.87805: stdout chunk (state=3): >>><<< 41445 1727204187.87806: stderr chunk (state=3): >>><<< 41445 1727204187.87826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.87842: _low_level_execute_command(): starting 41445 1727204187.87847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810 `" && echo 
ansible-tmp-1727204187.8783002-42181-136066352137810="` echo /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810 `" ) && sleep 0' 41445 1727204187.88394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.88398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.88401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.88404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.88453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.88486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.90309: stdout chunk (state=3): >>>ansible-tmp-1727204187.8783002-42181-136066352137810=/root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810 <<< 41445 1727204187.90417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.90444: stderr chunk (state=3): >>><<< 41445 1727204187.90448: stdout chunk (state=3): >>><<< 41445 
1727204187.90465: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204187.8783002-42181-136066352137810=/root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.90493: variable 'ansible_module_compression' from source: unknown 41445 1727204187.90539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204187.90571: variable 'ansible_facts' from source: unknown 41445 1727204187.90629: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py 41445 1727204187.90734: Sending initial data 41445 1727204187.90738: Sent initial data (156 bytes) 41445 1727204187.91340: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.91360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.91420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.92997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204187.93035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204187.93078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpf9n44gbv /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py <<< 41445 1727204187.93081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py" <<< 41445 1727204187.93121: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpf9n44gbv" to remote "/root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py" <<< 41445 1727204187.93742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.93886: stderr chunk (state=3): >>><<< 41445 1727204187.93889: stdout chunk (state=3): >>><<< 41445 1727204187.93891: done transferring module to remote 41445 1727204187.93893: _low_level_execute_command(): starting 41445 1727204187.93895: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/ /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py && sleep 0' 41445 1727204187.94968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204187.94981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204187.94989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.95003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
41445 1727204187.95017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204187.95079: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.95122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.95135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.95152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.95206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204187.96970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204187.96974: stdout chunk (state=3): >>><<< 41445 1727204187.96982: stderr chunk (state=3): >>><<< 41445 1727204187.96993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204187.96996: _low_level_execute_command(): starting 41445 1727204187.97001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/AnsiballZ_command.py && sleep 0' 41445 1727204187.97449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.97452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.97455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204187.97457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204187.97522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204187.97535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204187.97538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204187.97561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.13009: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:56:28.123288", "end": "2024-09-24 14:56:28.128291", "delta": "0:00:00.005003", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204188.15305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204188.15310: stdout chunk (state=3): >>><<< 41445 1727204188.15312: stderr chunk (state=3): >>><<< 41445 1727204188.15333: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:56:28.123288", "end": "2024-09-24 14:56:28.128291", "delta": "0:00:00.005003", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204188.15387: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204188.15483: _low_level_execute_command(): starting 41445 1727204188.15487: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.8783002-42181-136066352137810/ > /dev/null 2>&1 && sleep 0' 41445 1727204188.16098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204188.16196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.16218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.16240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.16623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.20919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.20935: stdout chunk (state=3): >>><<< 41445 1727204188.20955: stderr chunk (state=3): >>><<< 41445 1727204188.20977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.20989: handler run complete 41445 
1727204188.21181: Evaluated conditional (False): False 41445 1727204188.21185: attempt loop complete, returning result 41445 1727204188.21188: variable 'item' from source: unknown 41445 1727204188.21190: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005003", "end": "2024-09-24 14:56:28.128291", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 14:56:28.123288" } 41445 1727204188.21339: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.21343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.21345: variable 'omit' from source: magic vars 41445 1727204188.21439: variable 'ansible_distribution_major_version' from source: facts 41445 1727204188.21445: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204188.21595: variable 'type' from source: set_fact 41445 1727204188.21598: variable 'state' from source: include params 41445 1727204188.21601: variable 'interface' from source: set_fact 41445 1727204188.21614: variable 'current_interfaces' from source: set_fact 41445 1727204188.21616: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41445 1727204188.21619: variable 'omit' from source: magic vars 41445 1727204188.21629: variable 'omit' from source: magic vars 41445 1727204188.21655: variable 'item' from source: unknown 41445 1727204188.21716: variable 'item' from source: unknown 41445 1727204188.21780: variable 'omit' from source: magic vars 41445 1727204188.21783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204188.21786: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204188.21788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204188.21790: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204188.21792: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.21794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.21819: Set connection var ansible_shell_executable to /bin/sh 41445 1727204188.21822: Set connection var ansible_shell_type to sh 41445 1727204188.21832: Set connection var ansible_pipelining to False 41445 1727204188.21835: Set connection var ansible_timeout to 10 41445 1727204188.21837: Set connection var ansible_connection to ssh 41445 1727204188.21842: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204188.21857: variable 'ansible_shell_executable' from source: unknown 41445 1727204188.21860: variable 'ansible_connection' from source: unknown 41445 1727204188.21862: variable 'ansible_module_compression' from source: unknown 41445 1727204188.21865: variable 'ansible_shell_type' from source: unknown 41445 1727204188.21867: variable 'ansible_shell_executable' from source: unknown 41445 1727204188.21870: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.21874: variable 'ansible_pipelining' from source: unknown 41445 1727204188.21885: variable 'ansible_timeout' from source: unknown 41445 1727204188.21888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.21951: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204188.21960: variable 'omit' from source: magic vars 41445 1727204188.21963: starting attempt loop 41445 1727204188.21966: running the handler 41445 1727204188.21973: _low_level_execute_command(): starting 41445 1727204188.21979: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204188.22418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.22421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204188.22423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204188.22427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.22429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.22474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.22486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.22488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
41445 1727204188.22521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.24095: stdout chunk (state=3): >>>/root <<< 41445 1727204188.24260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.24266: stdout chunk (state=3): >>><<< 41445 1727204188.24269: stderr chunk (state=3): >>><<< 41445 1727204188.24369: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.24372: _low_level_execute_command(): starting 41445 1727204188.24375: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192 `" && echo ansible-tmp-1727204188.2429435-42181-223744731128192="` echo 
/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192 `" ) && sleep 0' 41445 1727204188.24955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.25050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.25063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.25125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.26971: stdout chunk (state=3): >>>ansible-tmp-1727204188.2429435-42181-223744731128192=/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192 <<< 41445 1727204188.27134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.27161: stderr chunk (state=3): >>><<< 41445 1727204188.27164: stdout chunk (state=3): >>><<< 41445 1727204188.27381: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204188.2429435-42181-223744731128192=/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.27384: variable 'ansible_module_compression' from source: unknown 41445 1727204188.27387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204188.27389: variable 'ansible_facts' from source: unknown 41445 1727204188.27391: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py 41445 1727204188.27523: Sending initial data 41445 1727204188.27533: Sent initial data (156 bytes) 41445 1727204188.28102: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204188.28130: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.28143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.28173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204188.28275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.28396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.28421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.28611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.30187: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py" <<< 41445 1727204188.30195: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpq4upqq20 /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py <<< 41445 1727204188.30251: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpq4upqq20" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py" <<< 41445 1727204188.32149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.32353: stderr chunk (state=3): >>><<< 41445 1727204188.32356: stdout chunk (state=3): >>><<< 41445 1727204188.32358: done transferring module to remote 41445 1727204188.32361: _low_level_execute_command(): starting 41445 1727204188.32491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/ /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py && sleep 0' 41445 1727204188.33628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.33631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204188.33633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.33635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.33793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.33815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.35503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.35556: stderr chunk (state=3): >>><<< 41445 1727204188.35569: stdout chunk (state=3): >>><<< 41445 1727204188.35598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.35636: _low_level_execute_command(): starting 41445 1727204188.35647: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/AnsiballZ_command.py && sleep 0' 41445 1727204188.36477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204188.36493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.36512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.36530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204188.36547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204188.36560: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204188.36574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.36601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204188.36689: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.36712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.36731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.36894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.52183: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:56:28.517298", "end": "2024-09-24 14:56:28.520681", "delta": "0:00:00.003383", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204188.53554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204188.53577: stderr chunk (state=3): >>><<< 41445 1727204188.53580: stdout chunk (state=3): >>><<< 41445 1727204188.53597: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:56:28.517298", "end": "2024-09-24 14:56:28.520681", "delta": "0:00:00.003383", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204188.53624: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204188.53630: _low_level_execute_command(): starting 41445 1727204188.53636: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.2429435-42181-223744731128192/ > /dev/null 2>&1 && sleep 0' 41445 1727204188.54064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.54067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.54070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204188.54072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204188.54074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.54133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.54142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.54166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.55955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.55984: stderr chunk (state=3): >>><<< 41445 1727204188.55988: stdout chunk (state=3): >>><<< 41445 1727204188.56003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.56006: handler run complete 41445 1727204188.56028: Evaluated conditional (False): False 41445 1727204188.56035: attempt loop complete, returning result 41445 1727204188.56050: variable 'item' from source: unknown 41445 1727204188.56111: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003383", "end": "2024-09-24 14:56:28.520681", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 14:56:28.517298" } 41445 1727204188.56231: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.56234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.56237: variable 'omit' from source: magic vars 41445 1727204188.56336: variable 'ansible_distribution_major_version' from source: facts 41445 1727204188.56340: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204188.56465: variable 'type' from source: set_fact 41445 1727204188.56469: variable 'state' from source: include params 41445 1727204188.56471: variable 'interface' from source: set_fact 41445 1727204188.56474: variable 'current_interfaces' from source: set_fact 41445 1727204188.56478: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41445 1727204188.56482: variable 'omit' from source: magic vars 41445 1727204188.56493: variable 'omit' from source: magic vars 41445 1727204188.56522: variable 'item' from source: unknown 41445 1727204188.56565: variable 'item' from source: unknown 41445 1727204188.56579: variable 'omit' from source: magic vars 41445 1727204188.56595: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204188.56602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204188.56607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204188.56620: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204188.56623: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.56625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.56670: Set connection var ansible_shell_executable to /bin/sh 41445 1727204188.56673: Set connection var ansible_shell_type to sh 41445 1727204188.56678: Set connection var ansible_pipelining to False 41445 1727204188.56692: Set connection var ansible_timeout to 10 41445 1727204188.56695: Set connection var ansible_connection to ssh 41445 1727204188.56700: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204188.56718: variable 'ansible_shell_executable' from source: unknown 41445 1727204188.56721: variable 'ansible_connection' from source: unknown 41445 1727204188.56724: variable 'ansible_module_compression' from source: unknown 41445 1727204188.56726: variable 'ansible_shell_type' from source: unknown 41445 1727204188.56728: variable 'ansible_shell_executable' from source: unknown 41445 1727204188.56730: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.56734: variable 'ansible_pipelining' from source: unknown 41445 1727204188.56736: variable 'ansible_timeout' from source: unknown 41445 1727204188.56740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.56807: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204188.56816: variable 'omit' from source: magic vars 41445 1727204188.56821: starting attempt loop 41445 1727204188.56823: running the handler 41445 1727204188.56829: _low_level_execute_command(): starting 41445 1727204188.56832: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204188.57280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.57283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204188.57286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.57288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.57290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.57342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.57345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 41445 1727204188.57385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.58926: stdout chunk (state=3): >>>/root <<< 41445 1727204188.59024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.59052: stderr chunk (state=3): >>><<< 41445 1727204188.59055: stdout chunk (state=3): >>><<< 41445 1727204188.59072: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.59084: _low_level_execute_command(): starting 41445 1727204188.59088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491 `" && echo 
ansible-tmp-1727204188.5907195-42181-66149674175491="` echo /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491 `" ) && sleep 0' 41445 1727204188.59523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.59526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.59536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.59579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.59594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.59637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.61459: stdout chunk (state=3): >>>ansible-tmp-1727204188.5907195-42181-66149674175491=/root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491 <<< 41445 1727204188.61560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.61592: stderr chunk (state=3): >>><<< 41445 1727204188.61595: stdout chunk (state=3): >>><<< 41445 
1727204188.61609: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204188.5907195-42181-66149674175491=/root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.61629: variable 'ansible_module_compression' from source: unknown 41445 1727204188.61656: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204188.61671: variable 'ansible_facts' from source: unknown 41445 1727204188.61722: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py 41445 1727204188.61813: Sending initial data 41445 1727204188.61817: Sent initial data (155 bytes) 41445 1727204188.62426: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.62454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.62457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.62500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.63983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204188.64016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204188.64048: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpwsf7q9_y /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py <<< 41445 1727204188.64053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py" <<< 41445 1727204188.64081: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpwsf7q9_y" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py" <<< 41445 1727204188.64084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py" <<< 41445 1727204188.64580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.64625: stderr chunk (state=3): >>><<< 41445 1727204188.64628: stdout chunk (state=3): >>><<< 41445 1727204188.64665: done transferring module to remote 41445 1727204188.64679: _low_level_execute_command(): starting 41445 1727204188.64682: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/ /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py && sleep 0' 41445 1727204188.65126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.65129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204188.65132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.65134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.65191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.65196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.65198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.65228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.66908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.66935: stderr chunk (state=3): >>><<< 41445 1727204188.66938: stdout chunk (state=3): >>><<< 41445 1727204188.66952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.66955: _low_level_execute_command(): starting 41445 1727204188.66966: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/AnsiballZ_command.py && sleep 0' 41445 1727204188.67408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.67412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.67414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204188.67416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 
1727204188.67418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.67471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.67479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.67482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.67518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.83148: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:56:28.825778", "end": "2024-09-24 14:56:28.829341", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204188.84462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204188.84489: stderr chunk (state=3): >>><<< 41445 1727204188.84492: stdout chunk (state=3): >>><<< 41445 1727204188.84507: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:56:28.825778", "end": "2024-09-24 14:56:28.829341", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204188.84532: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204188.84537: _low_level_execute_command(): starting 41445 1727204188.84542: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.5907195-42181-66149674175491/ > /dev/null 2>&1 && sleep 0' 41445 1727204188.84967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.84999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204188.85002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204188.85004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.85006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 
1727204188.85012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.85082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.85090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.85110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.86926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.86930: stdout chunk (state=3): >>><<< 41445 1727204188.86932: stderr chunk (state=3): >>><<< 41445 1727204188.87010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.87014: handler run complete 41445 1727204188.87016: Evaluated conditional (False): False 41445 1727204188.87018: attempt loop complete, returning result 41445 1727204188.87057: variable 'item' from source: unknown 41445 1727204188.87224: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003563", "end": "2024-09-24 14:56:28.829341", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 14:56:28.825778" } 41445 1727204188.87400: dumping result to json 41445 1727204188.87403: done dumping result, returning 41445 1727204188.87405: done running TaskExecutor() for managed-node3/TASK: Create veth interface ethtest0 [028d2410-947f-bf02-eee4-00000000026e] 41445 1727204188.87407: sending task result for task 028d2410-947f-bf02-eee4-00000000026e 41445 1727204188.87456: done sending task result for task 028d2410-947f-bf02-eee4-00000000026e 41445 1727204188.87459: WORKER PROCESS EXITING 41445 1727204188.87568: no more pending results, returning what we have 41445 1727204188.87571: results queue empty 41445 1727204188.87572: checking for any_errors_fatal 41445 1727204188.87579: done checking for any_errors_fatal 41445 1727204188.87580: checking for max_fail_percentage 41445 1727204188.87582: done checking for max_fail_percentage 41445 1727204188.87582: checking to see if all hosts have failed and the running result is not ok 41445 1727204188.87583: done checking to see if all hosts have failed 41445 1727204188.87584: getting the remaining hosts for this loop 41445 1727204188.87585: done getting the remaining hosts for this loop 41445 1727204188.87588: getting the next task for host managed-node3 41445 1727204188.87593: done getting next task for host 
managed-node3 41445 1727204188.87631: ^ task is: TASK: Set up veth as managed by NetworkManager 41445 1727204188.87634: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204188.87637: getting variables 41445 1727204188.87639: in VariableManager get_vars() 41445 1727204188.87685: Calling all_inventory to load vars for managed-node3 41445 1727204188.87688: Calling groups_inventory to load vars for managed-node3 41445 1727204188.87690: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204188.87715: Calling all_plugins_play to load vars for managed-node3 41445 1727204188.87718: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204188.87722: Calling groups_plugins_play to load vars for managed-node3 41445 1727204188.87966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204188.88458: done with get_vars() 41445 1727204188.88636: done getting variables 41445 1727204188.88816: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:56:28 -0400 (0:00:01.065) 0:00:07.676 ***** 41445 1727204188.88845: entering _queue_task() for managed-node3/command 41445 1727204188.89458: worker is 1 (out of 1 available) 41445 1727204188.89472: exiting _queue_task() for managed-node3/command 41445 1727204188.89518: done queuing things up, now waiting for results queue to drain 41445 1727204188.89520: waiting for pending results... 41445 1727204188.89785: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 41445 1727204188.89902: in run() - task 028d2410-947f-bf02-eee4-00000000026f 41445 1727204188.89906: variable 'ansible_search_path' from source: unknown 41445 1727204188.89912: variable 'ansible_search_path' from source: unknown 41445 1727204188.89943: calling self._execute() 41445 1727204188.90047: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.90182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.90186: variable 'omit' from source: magic vars 41445 1727204188.90467: variable 'ansible_distribution_major_version' from source: facts 41445 1727204188.90488: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204188.90684: variable 'type' from source: set_fact 41445 1727204188.90695: variable 'state' from source: include params 41445 1727204188.90705: Evaluated conditional (type == 'veth' and state == 'present'): True 41445 1727204188.90720: variable 'omit' from source: magic vars 41445 1727204188.90768: variable 'omit' from source: magic vars 41445 1727204188.90879: variable 'interface' from source: set_fact 41445 1727204188.90904: variable 'omit' from source: magic vars 41445 1727204188.90957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204188.91007: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204188.91036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204188.91059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204188.91088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204188.91196: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204188.91200: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.91202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204188.91248: Set connection var ansible_shell_executable to /bin/sh 41445 1727204188.91260: Set connection var ansible_shell_type to sh 41445 1727204188.91273: Set connection var ansible_pipelining to False 41445 1727204188.91289: Set connection var ansible_timeout to 10 41445 1727204188.91297: Set connection var ansible_connection to ssh 41445 1727204188.91783: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204188.91787: variable 'ansible_shell_executable' from source: unknown 41445 1727204188.91789: variable 'ansible_connection' from source: unknown 41445 1727204188.91791: variable 'ansible_module_compression' from source: unknown 41445 1727204188.91793: variable 'ansible_shell_type' from source: unknown 41445 1727204188.91796: variable 'ansible_shell_executable' from source: unknown 41445 1727204188.91798: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204188.91800: variable 'ansible_pipelining' from source: unknown 41445 1727204188.91802: variable 'ansible_timeout' from source: unknown 41445 1727204188.91804: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41445 1727204188.91807: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204188.91893: variable 'omit' from source: magic vars 41445 1727204188.91906: starting attempt loop 41445 1727204188.92005: running the handler 41445 1727204188.92030: _low_level_execute_command(): starting 41445 1727204188.92044: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204188.93313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204188.93334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.93386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.93402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204188.93422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.93566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.93591: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.93618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.93790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204188.95325: stdout chunk (state=3): >>>/root <<< 41445 1727204188.95466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.95481: stdout chunk (state=3): >>><<< 41445 1727204188.95495: stderr chunk (state=3): >>><<< 41445 1727204188.95527: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.95550: _low_level_execute_command(): starting 41445 1727204188.95560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278 `" && echo ansible-tmp-1727204188.9553728-42370-74910895315278="` echo /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278 `" ) && sleep 0' 41445 1727204188.96184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204188.96198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.96221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.96243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204188.96260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204188.96270: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204188.96286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.96357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.96396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204188.96418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.96442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.96565: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41445 1727204188.98393: stdout chunk (state=3): >>>ansible-tmp-1727204188.9553728-42370-74910895315278=/root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278 <<< 41445 1727204188.98561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204188.98564: stdout chunk (state=3): >>><<< 41445 1727204188.98566: stderr chunk (state=3): >>><<< 41445 1727204188.98585: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204188.9553728-42370-74910895315278=/root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204188.98619: variable 'ansible_module_compression' from source: unknown 41445 1727204188.98750: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204188.98753: variable 'ansible_facts' from source: unknown 41445 1727204188.98819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py 41445 1727204188.98990: Sending initial data 41445 1727204188.99000: Sent initial data (155 bytes) 41445 1727204188.99670: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204188.99691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204188.99708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204188.99748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204188.99761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204188.99859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204188.99882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204188.99960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 
1727204189.01487: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204189.01537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204189.01582: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmprg4y_70s /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py <<< 41445 1727204189.01585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py" <<< 41445 1727204189.01614: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmprg4y_70s" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py" <<< 41445 1727204189.02353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.02387: stderr chunk (state=3): >>><<< 41445 1727204189.02502: stdout chunk (state=3): >>><<< 41445 1727204189.02506: done transferring module to remote 41445 
1727204189.02508: _low_level_execute_command(): starting 41445 1727204189.02510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/ /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py && sleep 0' 41445 1727204189.03067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204189.03088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.03129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204189.03217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.03246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.03308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.05080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.05084: stdout chunk (state=3): >>><<< 41445 1727204189.05097: stderr chunk (state=3): >>><<< 41445 
1727204189.05118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204189.05204: _low_level_execute_command(): starting 41445 1727204189.05207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/AnsiballZ_command.py && sleep 0' 41445 1727204189.05883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.05931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.05951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.05971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.06211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.23311: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:56:29.210546", "end": "2024-09-24 14:56:29.229957", "delta": "0:00:00.019411", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204189.24871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204189.24879: stdout chunk (state=3): >>><<< 41445 1727204189.24882: stderr chunk (state=3): >>><<< 41445 1727204189.24885: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:56:29.210546", "end": "2024-09-24 14:56:29.229957", "delta": "0:00:00.019411", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204189.24888: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204189.24890: _low_level_execute_command(): starting 41445 1727204189.24892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.9553728-42370-74910895315278/ > /dev/null 2>&1 && sleep 0' 41445 1727204189.25483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204189.25495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204189.25509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.25525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204189.25565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 
1727204189.25593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204189.25682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.25692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.25703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.25760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.27663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.27674: stdout chunk (state=3): >>><<< 41445 1727204189.27698: stderr chunk (state=3): >>><<< 41445 1727204189.27722: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204189.27734: handler run complete 41445 1727204189.27880: Evaluated conditional (False): False 41445 1727204189.27883: attempt loop complete, returning result 41445 1727204189.27886: _execute() done 41445 1727204189.27888: dumping result to json 41445 1727204189.27890: done dumping result, returning 41445 1727204189.27891: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [028d2410-947f-bf02-eee4-00000000026f] 41445 1727204189.27893: sending task result for task 028d2410-947f-bf02-eee4-00000000026f 41445 1727204189.27970: done sending task result for task 028d2410-947f-bf02-eee4-00000000026f 41445 1727204189.27973: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019411", "end": "2024-09-24 14:56:29.229957", "rc": 0, "start": "2024-09-24 14:56:29.210546" } 41445 1727204189.28054: no more pending results, returning what we have 41445 1727204189.28057: results queue empty 41445 1727204189.28058: checking for any_errors_fatal 41445 1727204189.28070: done checking for any_errors_fatal 41445 1727204189.28071: checking for max_fail_percentage 41445 1727204189.28073: done checking for max_fail_percentage 41445 1727204189.28074: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.28075: done checking to see if all hosts have failed 41445 1727204189.28079: getting the remaining hosts for this loop 41445 1727204189.28080: done getting the remaining hosts for this loop 41445 1727204189.28281: getting the next task for host managed-node3 41445 1727204189.28288: done getting next task for host managed-node3 41445 1727204189.28291: ^ task is: TASK: Delete veth interface {{ 
interface }} 41445 1727204189.28294: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204189.28297: getting variables 41445 1727204189.28299: in VariableManager get_vars() 41445 1727204189.28340: Calling all_inventory to load vars for managed-node3 41445 1727204189.28343: Calling groups_inventory to load vars for managed-node3 41445 1727204189.28345: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.28355: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.28358: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.28361: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.28655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.28791: done with get_vars() 41445 1727204189.28799: done getting variables 41445 1727204189.28848: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204189.28936: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.401) 0:00:08.077 ***** 41445 1727204189.28958: entering _queue_task() for managed-node3/command 41445 1727204189.29166: worker is 1 (out of 1 available) 41445 1727204189.29182: exiting _queue_task() for managed-node3/command 41445 1727204189.29194: done queuing things up, now waiting for results queue to drain 41445 1727204189.29196: waiting for pending results... 41445 1727204189.29354: running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 41445 1727204189.29420: in run() - task 028d2410-947f-bf02-eee4-000000000270 41445 1727204189.29543: variable 'ansible_search_path' from source: unknown 41445 1727204189.29546: variable 'ansible_search_path' from source: unknown 41445 1727204189.29549: calling self._execute() 41445 1727204189.29552: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.29554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.29556: variable 'omit' from source: magic vars 41445 1727204189.29806: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.29818: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.29949: variable 'type' from source: set_fact 41445 1727204189.29954: variable 'state' from source: include params 41445 1727204189.29957: variable 'interface' from source: set_fact 41445 1727204189.29960: variable 'current_interfaces' from source: set_fact 41445 1727204189.29969: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 41445 1727204189.29972: when evaluation is False, skipping this task 41445 1727204189.29974: _execute() done 41445 1727204189.29978: dumping result to json 41445 1727204189.29981: done dumping result, returning 41445 1727204189.29983: 
done running TaskExecutor() for managed-node3/TASK: Delete veth interface ethtest0 [028d2410-947f-bf02-eee4-000000000270] 41445 1727204189.29994: sending task result for task 028d2410-947f-bf02-eee4-000000000270 41445 1727204189.30066: done sending task result for task 028d2410-947f-bf02-eee4-000000000270 41445 1727204189.30069: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41445 1727204189.30141: no more pending results, returning what we have 41445 1727204189.30144: results queue empty 41445 1727204189.30145: checking for any_errors_fatal 41445 1727204189.30152: done checking for any_errors_fatal 41445 1727204189.30152: checking for max_fail_percentage 41445 1727204189.30154: done checking for max_fail_percentage 41445 1727204189.30154: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.30155: done checking to see if all hosts have failed 41445 1727204189.30156: getting the remaining hosts for this loop 41445 1727204189.30157: done getting the remaining hosts for this loop 41445 1727204189.30161: getting the next task for host managed-node3 41445 1727204189.30166: done getting next task for host managed-node3 41445 1727204189.30168: ^ task is: TASK: Create dummy interface {{ interface }} 41445 1727204189.30171: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41445 1727204189.30174: getting variables 41445 1727204189.30177: in VariableManager get_vars() 41445 1727204189.30211: Calling all_inventory to load vars for managed-node3 41445 1727204189.30214: Calling groups_inventory to load vars for managed-node3 41445 1727204189.30216: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.30225: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.30227: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.30230: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.30372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.30629: done with get_vars() 41445 1727204189.30639: done getting variables 41445 1727204189.30705: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204189.30819: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.018) 0:00:08.096 ***** 41445 1727204189.30849: entering _queue_task() for managed-node3/command 41445 1727204189.31113: worker is 1 (out of 1 available) 41445 1727204189.31289: exiting _queue_task() for managed-node3/command 41445 1727204189.31301: done queuing things up, now waiting for results queue to drain 41445 1727204189.31302: waiting for pending results... 
41445 1727204189.31434: running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 41445 1727204189.31548: in run() - task 028d2410-947f-bf02-eee4-000000000271 41445 1727204189.31561: variable 'ansible_search_path' from source: unknown 41445 1727204189.31564: variable 'ansible_search_path' from source: unknown 41445 1727204189.31631: calling self._execute() 41445 1727204189.31682: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.31686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.31694: variable 'omit' from source: magic vars 41445 1727204189.31960: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.31972: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.32105: variable 'type' from source: set_fact 41445 1727204189.32108: variable 'state' from source: include params 41445 1727204189.32115: variable 'interface' from source: set_fact 41445 1727204189.32118: variable 'current_interfaces' from source: set_fact 41445 1727204189.32126: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 41445 1727204189.32129: when evaluation is False, skipping this task 41445 1727204189.32132: _execute() done 41445 1727204189.32134: dumping result to json 41445 1727204189.32136: done dumping result, returning 41445 1727204189.32141: done running TaskExecutor() for managed-node3/TASK: Create dummy interface ethtest0 [028d2410-947f-bf02-eee4-000000000271] 41445 1727204189.32147: sending task result for task 028d2410-947f-bf02-eee4-000000000271 41445 1727204189.32223: done sending task result for task 028d2410-947f-bf02-eee4-000000000271 41445 1727204189.32226: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 41445 1727204189.32271: no more pending results, returning what we have 41445 1727204189.32275: results queue empty 41445 1727204189.32278: checking for any_errors_fatal 41445 1727204189.32283: done checking for any_errors_fatal 41445 1727204189.32284: checking for max_fail_percentage 41445 1727204189.32286: done checking for max_fail_percentage 41445 1727204189.32287: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.32287: done checking to see if all hosts have failed 41445 1727204189.32288: getting the remaining hosts for this loop 41445 1727204189.32289: done getting the remaining hosts for this loop 41445 1727204189.32293: getting the next task for host managed-node3 41445 1727204189.32299: done getting next task for host managed-node3 41445 1727204189.32301: ^ task is: TASK: Delete dummy interface {{ interface }} 41445 1727204189.32304: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.32308: getting variables 41445 1727204189.32310: in VariableManager get_vars() 41445 1727204189.32345: Calling all_inventory to load vars for managed-node3 41445 1727204189.32347: Calling groups_inventory to load vars for managed-node3 41445 1727204189.32349: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.32358: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.32360: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.32362: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.32507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.32637: done with get_vars() 41445 1727204189.32644: done getting variables 41445 1727204189.32688: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204189.32765: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.019) 0:00:08.115 ***** 41445 1727204189.32788: entering _queue_task() for managed-node3/command 41445 1727204189.32980: worker is 1 (out of 1 available) 41445 1727204189.32995: exiting _queue_task() for managed-node3/command 41445 1727204189.33008: done queuing things up, now waiting for results queue to drain 41445 1727204189.33009: waiting for pending results... 
41445 1727204189.33156: running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 41445 1727204189.33222: in run() - task 028d2410-947f-bf02-eee4-000000000272 41445 1727204189.33239: variable 'ansible_search_path' from source: unknown 41445 1727204189.33243: variable 'ansible_search_path' from source: unknown 41445 1727204189.33267: calling self._execute() 41445 1727204189.33332: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.33338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.33347: variable 'omit' from source: magic vars 41445 1727204189.33781: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.33784: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.33832: variable 'type' from source: set_fact 41445 1727204189.33842: variable 'state' from source: include params 41445 1727204189.33853: variable 'interface' from source: set_fact 41445 1727204189.33862: variable 'current_interfaces' from source: set_fact 41445 1727204189.33875: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 41445 1727204189.33884: when evaluation is False, skipping this task 41445 1727204189.33891: _execute() done 41445 1727204189.33898: dumping result to json 41445 1727204189.33907: done dumping result, returning 41445 1727204189.33916: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface ethtest0 [028d2410-947f-bf02-eee4-000000000272] 41445 1727204189.33926: sending task result for task 028d2410-947f-bf02-eee4-000000000272 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41445 1727204189.34060: no more pending results, returning what we have 41445 1727204189.34064: results queue empty 41445 
1727204189.34065: checking for any_errors_fatal 41445 1727204189.34071: done checking for any_errors_fatal 41445 1727204189.34073: checking for max_fail_percentage 41445 1727204189.34074: done checking for max_fail_percentage 41445 1727204189.34078: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.34079: done checking to see if all hosts have failed 41445 1727204189.34080: getting the remaining hosts for this loop 41445 1727204189.34081: done getting the remaining hosts for this loop 41445 1727204189.34085: getting the next task for host managed-node3 41445 1727204189.34092: done getting next task for host managed-node3 41445 1727204189.34094: ^ task is: TASK: Create tap interface {{ interface }} 41445 1727204189.34097: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.34101: getting variables 41445 1727204189.34103: in VariableManager get_vars() 41445 1727204189.34149: Calling all_inventory to load vars for managed-node3 41445 1727204189.34152: Calling groups_inventory to load vars for managed-node3 41445 1727204189.34154: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.34167: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.34169: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.34171: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.34361: done sending task result for task 028d2410-947f-bf02-eee4-000000000272 41445 1727204189.34364: WORKER PROCESS EXITING 41445 1727204189.34604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.34812: done with get_vars() 41445 1727204189.34823: done getting variables 41445 1727204189.34879: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204189.35065: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.023) 0:00:08.138 ***** 41445 1727204189.35096: entering _queue_task() for managed-node3/command 41445 1727204189.35333: worker is 1 (out of 1 available) 41445 1727204189.35346: exiting _queue_task() for managed-node3/command 41445 1727204189.35356: done queuing things up, now waiting for results queue to drain 41445 1727204189.35357: waiting for pending results... 
41445 1727204189.35506: running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 41445 1727204189.35572: in run() - task 028d2410-947f-bf02-eee4-000000000273 41445 1727204189.35587: variable 'ansible_search_path' from source: unknown 41445 1727204189.35591: variable 'ansible_search_path' from source: unknown 41445 1727204189.35623: calling self._execute() 41445 1727204189.35687: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.35691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.35700: variable 'omit' from source: magic vars 41445 1727204189.36029: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.36033: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.36108: variable 'type' from source: set_fact 41445 1727204189.36114: variable 'state' from source: include params 41445 1727204189.36117: variable 'interface' from source: set_fact 41445 1727204189.36119: variable 'current_interfaces' from source: set_fact 41445 1727204189.36126: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 41445 1727204189.36130: when evaluation is False, skipping this task 41445 1727204189.36133: _execute() done 41445 1727204189.36136: dumping result to json 41445 1727204189.36138: done dumping result, returning 41445 1727204189.36140: done running TaskExecutor() for managed-node3/TASK: Create tap interface ethtest0 [028d2410-947f-bf02-eee4-000000000273] 41445 1727204189.36148: sending task result for task 028d2410-947f-bf02-eee4-000000000273 41445 1727204189.36229: done sending task result for task 028d2410-947f-bf02-eee4-000000000273 41445 1727204189.36232: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 41445 1727204189.36296: no more pending results, returning what we have 41445 1727204189.36299: results queue empty 41445 1727204189.36300: checking for any_errors_fatal 41445 1727204189.36307: done checking for any_errors_fatal 41445 1727204189.36308: checking for max_fail_percentage 41445 1727204189.36311: done checking for max_fail_percentage 41445 1727204189.36312: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.36313: done checking to see if all hosts have failed 41445 1727204189.36313: getting the remaining hosts for this loop 41445 1727204189.36314: done getting the remaining hosts for this loop 41445 1727204189.36318: getting the next task for host managed-node3 41445 1727204189.36323: done getting next task for host managed-node3 41445 1727204189.36325: ^ task is: TASK: Delete tap interface {{ interface }} 41445 1727204189.36328: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.36331: getting variables 41445 1727204189.36332: in VariableManager get_vars() 41445 1727204189.36367: Calling all_inventory to load vars for managed-node3 41445 1727204189.36369: Calling groups_inventory to load vars for managed-node3 41445 1727204189.36371: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.36382: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.36385: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.36387: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.36515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.36644: done with get_vars() 41445 1727204189.36652: done getting variables 41445 1727204189.36707: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204189.36806: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.017) 0:00:08.156 ***** 41445 1727204189.36835: entering _queue_task() for managed-node3/command 41445 1727204189.37077: worker is 1 (out of 1 available) 41445 1727204189.37090: exiting _queue_task() for managed-node3/command 41445 1727204189.37104: done queuing things up, now waiting for results queue to drain 41445 1727204189.37105: waiting for pending results... 
41445 1727204189.37393: running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 41445 1727204189.37398: in run() - task 028d2410-947f-bf02-eee4-000000000274 41445 1727204189.37403: variable 'ansible_search_path' from source: unknown 41445 1727204189.37409: variable 'ansible_search_path' from source: unknown 41445 1727204189.37452: calling self._execute() 41445 1727204189.37540: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.37550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.37561: variable 'omit' from source: magic vars 41445 1727204189.37885: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.37992: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.38220: variable 'type' from source: set_fact 41445 1727204189.38225: variable 'state' from source: include params 41445 1727204189.38230: variable 'interface' from source: set_fact 41445 1727204189.38232: variable 'current_interfaces' from source: set_fact 41445 1727204189.38234: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 41445 1727204189.38235: when evaluation is False, skipping this task 41445 1727204189.38237: _execute() done 41445 1727204189.38239: dumping result to json 41445 1727204189.38240: done dumping result, returning 41445 1727204189.38242: done running TaskExecutor() for managed-node3/TASK: Delete tap interface ethtest0 [028d2410-947f-bf02-eee4-000000000274] 41445 1727204189.38244: sending task result for task 028d2410-947f-bf02-eee4-000000000274 41445 1727204189.38322: done sending task result for task 028d2410-947f-bf02-eee4-000000000274 41445 1727204189.38326: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
41445 1727204189.38403: no more pending results, returning what we have 41445 1727204189.38408: results queue empty 41445 1727204189.38412: checking for any_errors_fatal 41445 1727204189.38422: done checking for any_errors_fatal 41445 1727204189.38423: checking for max_fail_percentage 41445 1727204189.38424: done checking for max_fail_percentage 41445 1727204189.38426: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.38426: done checking to see if all hosts have failed 41445 1727204189.38427: getting the remaining hosts for this loop 41445 1727204189.38429: done getting the remaining hosts for this loop 41445 1727204189.38432: getting the next task for host managed-node3 41445 1727204189.38440: done getting next task for host managed-node3 41445 1727204189.38443: ^ task is: TASK: Include the task 'assert_device_present.yml' 41445 1727204189.38446: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.38450: getting variables 41445 1727204189.38451: in VariableManager get_vars() 41445 1727204189.38502: Calling all_inventory to load vars for managed-node3 41445 1727204189.38505: Calling groups_inventory to load vars for managed-node3 41445 1727204189.38507: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.38520: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.38523: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.38525: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.38779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.39067: done with get_vars() 41445 1727204189.39074: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:21 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.023) 0:00:08.179 ***** 41445 1727204189.39162: entering _queue_task() for managed-node3/include_tasks 41445 1727204189.39371: worker is 1 (out of 1 available) 41445 1727204189.39386: exiting _queue_task() for managed-node3/include_tasks 41445 1727204189.39399: done queuing things up, now waiting for results queue to drain 41445 1727204189.39400: waiting for pending results... 
41445 1727204189.39556: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 41445 1727204189.39614: in run() - task 028d2410-947f-bf02-eee4-00000000000e 41445 1727204189.39632: variable 'ansible_search_path' from source: unknown 41445 1727204189.39658: calling self._execute() 41445 1727204189.39726: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.39736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.39740: variable 'omit' from source: magic vars 41445 1727204189.40012: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.40025: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.40030: _execute() done 41445 1727204189.40033: dumping result to json 41445 1727204189.40037: done dumping result, returning 41445 1727204189.40042: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [028d2410-947f-bf02-eee4-00000000000e] 41445 1727204189.40049: sending task result for task 028d2410-947f-bf02-eee4-00000000000e 41445 1727204189.40136: done sending task result for task 028d2410-947f-bf02-eee4-00000000000e 41445 1727204189.40138: WORKER PROCESS EXITING 41445 1727204189.40200: no more pending results, returning what we have 41445 1727204189.40205: in VariableManager get_vars() 41445 1727204189.40243: Calling all_inventory to load vars for managed-node3 41445 1727204189.40246: Calling groups_inventory to load vars for managed-node3 41445 1727204189.40248: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.40257: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.40259: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.40262: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.40392: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.40514: done with get_vars() 41445 1727204189.40520: variable 'ansible_search_path' from source: unknown 41445 1727204189.40529: we have included files to process 41445 1727204189.40530: generating all_blocks data 41445 1727204189.40531: done generating all_blocks data 41445 1727204189.40534: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41445 1727204189.40535: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41445 1727204189.40537: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41445 1727204189.40643: in VariableManager get_vars() 41445 1727204189.40658: done with get_vars() 41445 1727204189.40734: done processing included file 41445 1727204189.40735: iterating over new_blocks loaded from include file 41445 1727204189.40736: in VariableManager get_vars() 41445 1727204189.40747: done with get_vars() 41445 1727204189.40748: filtering new block on tags 41445 1727204189.40760: done filtering new block on tags 41445 1727204189.40761: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 41445 1727204189.40790: extending task lists for all hosts with included blocks 41445 1727204189.42586: done extending task lists 41445 1727204189.42588: done processing included files 41445 1727204189.42588: results queue empty 41445 1727204189.42589: checking for any_errors_fatal 41445 1727204189.42591: done checking for any_errors_fatal 41445 1727204189.42591: checking for max_fail_percentage 41445 1727204189.42592: done 
checking for max_fail_percentage 41445 1727204189.42593: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.42594: done checking to see if all hosts have failed 41445 1727204189.42594: getting the remaining hosts for this loop 41445 1727204189.42595: done getting the remaining hosts for this loop 41445 1727204189.42597: getting the next task for host managed-node3 41445 1727204189.42600: done getting next task for host managed-node3 41445 1727204189.42601: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41445 1727204189.42603: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.42605: getting variables 41445 1727204189.42605: in VariableManager get_vars() 41445 1727204189.42618: Calling all_inventory to load vars for managed-node3 41445 1727204189.42620: Calling groups_inventory to load vars for managed-node3 41445 1727204189.42621: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.42626: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.42627: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.42629: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.42724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.42857: done with get_vars() 41445 1727204189.42864: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.037) 0:00:08.216 ***** 41445 1727204189.42917: entering _queue_task() for managed-node3/include_tasks 41445 1727204189.43141: worker is 1 (out of 1 available) 41445 1727204189.43156: exiting _queue_task() for managed-node3/include_tasks 41445 1727204189.43170: done queuing things up, now waiting for results queue to drain 41445 1727204189.43171: waiting for pending results... 
41445 1727204189.43339: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 41445 1727204189.43401: in run() - task 028d2410-947f-bf02-eee4-0000000003e0 41445 1727204189.43414: variable 'ansible_search_path' from source: unknown 41445 1727204189.43417: variable 'ansible_search_path' from source: unknown 41445 1727204189.43443: calling self._execute() 41445 1727204189.43509: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.43517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.43530: variable 'omit' from source: magic vars 41445 1727204189.43818: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.43827: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.43834: _execute() done 41445 1727204189.43836: dumping result to json 41445 1727204189.43841: done dumping result, returning 41445 1727204189.43854: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-bf02-eee4-0000000003e0] 41445 1727204189.43857: sending task result for task 028d2410-947f-bf02-eee4-0000000003e0 41445 1727204189.43942: done sending task result for task 028d2410-947f-bf02-eee4-0000000003e0 41445 1727204189.43944: WORKER PROCESS EXITING 41445 1727204189.43980: no more pending results, returning what we have 41445 1727204189.43984: in VariableManager get_vars() 41445 1727204189.44032: Calling all_inventory to load vars for managed-node3 41445 1727204189.44035: Calling groups_inventory to load vars for managed-node3 41445 1727204189.44037: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.44049: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.44052: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.44055: Calling groups_plugins_play to load vars for managed-node3 41445 
1727204189.44201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.44326: done with get_vars() 41445 1727204189.44332: variable 'ansible_search_path' from source: unknown 41445 1727204189.44333: variable 'ansible_search_path' from source: unknown 41445 1727204189.44358: we have included files to process 41445 1727204189.44359: generating all_blocks data 41445 1727204189.44361: done generating all_blocks data 41445 1727204189.44362: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41445 1727204189.44362: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41445 1727204189.44364: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41445 1727204189.44523: done processing included file 41445 1727204189.44525: iterating over new_blocks loaded from include file 41445 1727204189.44526: in VariableManager get_vars() 41445 1727204189.44538: done with get_vars() 41445 1727204189.44539: filtering new block on tags 41445 1727204189.44548: done filtering new block on tags 41445 1727204189.44549: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 41445 1727204189.44552: extending task lists for all hosts with included blocks 41445 1727204189.44612: done extending task lists 41445 1727204189.44613: done processing included files 41445 1727204189.44614: results queue empty 41445 1727204189.44614: checking for any_errors_fatal 41445 1727204189.44616: done checking for any_errors_fatal 41445 1727204189.44617: checking for max_fail_percentage 41445 1727204189.44618: done checking for 
max_fail_percentage 41445 1727204189.44618: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.44619: done checking to see if all hosts have failed 41445 1727204189.44619: getting the remaining hosts for this loop 41445 1727204189.44620: done getting the remaining hosts for this loop 41445 1727204189.44621: getting the next task for host managed-node3 41445 1727204189.44624: done getting next task for host managed-node3 41445 1727204189.44625: ^ task is: TASK: Get stat for interface {{ interface }} 41445 1727204189.44628: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.44629: getting variables 41445 1727204189.44630: in VariableManager get_vars() 41445 1727204189.44638: Calling all_inventory to load vars for managed-node3 41445 1727204189.44639: Calling groups_inventory to load vars for managed-node3 41445 1727204189.44640: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.44644: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.44646: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.44647: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.44758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.44877: done with get_vars() 41445 1727204189.44884: done getting variables 41445 1727204189.44996: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest0] *****************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.021) 0:00:08.237 *****

41445 1727204189.45021: entering _queue_task() for managed-node3/stat 41445 1727204189.45235: worker is 1 (out of 1 available) 41445 1727204189.45250: exiting _queue_task() for managed-node3/stat 41445 1727204189.45260: done queuing things up, now waiting for results queue to drain 41445 1727204189.45262: waiting for pending results... 
41445 1727204189.45592: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 41445 1727204189.45597: in run() - task 028d2410-947f-bf02-eee4-0000000004ff 41445 1727204189.45600: variable 'ansible_search_path' from source: unknown 41445 1727204189.45603: variable 'ansible_search_path' from source: unknown 41445 1727204189.45606: calling self._execute() 41445 1727204189.45684: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.45696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.45712: variable 'omit' from source: magic vars 41445 1727204189.46056: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.46077: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.46089: variable 'omit' from source: magic vars 41445 1727204189.46134: variable 'omit' from source: magic vars 41445 1727204189.46230: variable 'interface' from source: set_fact 41445 1727204189.46253: variable 'omit' from source: magic vars 41445 1727204189.46302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204189.46342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204189.46364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204189.46391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204189.46409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204189.46443: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204189.46452: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.46460: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.46553: Set connection var ansible_shell_executable to /bin/sh 41445 1727204189.46562: Set connection var ansible_shell_type to sh 41445 1727204189.46578: Set connection var ansible_pipelining to False 41445 1727204189.46589: Set connection var ansible_timeout to 10 41445 1727204189.46592: Set connection var ansible_connection to ssh 41445 1727204189.46598: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204189.46620: variable 'ansible_shell_executable' from source: unknown 41445 1727204189.46629: variable 'ansible_connection' from source: unknown 41445 1727204189.46632: variable 'ansible_module_compression' from source: unknown 41445 1727204189.46634: variable 'ansible_shell_type' from source: unknown 41445 1727204189.46637: variable 'ansible_shell_executable' from source: unknown 41445 1727204189.46639: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.46643: variable 'ansible_pipelining' from source: unknown 41445 1727204189.46645: variable 'ansible_timeout' from source: unknown 41445 1727204189.46649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.46801: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204189.46811: variable 'omit' from source: magic vars 41445 1727204189.46814: starting attempt loop 41445 1727204189.46817: running the handler 41445 1727204189.46829: _low_level_execute_command(): starting 41445 1727204189.46845: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204189.47338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.47343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.47346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.47382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.47397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.47440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.49246: stdout chunk (state=3): >>>/root <<< 41445 1727204189.49249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.49251: stdout chunk (state=3): >>><<< 41445 1727204189.49253: stderr chunk (state=3): >>><<< 41445 1727204189.49256: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204189.49278: _low_level_execute_command(): starting 41445 1727204189.49288: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983 `" && echo ansible-tmp-1727204189.4926262-42440-22752316330983="` echo /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983 `" ) && sleep 0' 41445 1727204189.49853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204189.49856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204189.49858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.49861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.49869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.49928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.49931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.50065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.50104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.51950: stdout chunk (state=3): >>>ansible-tmp-1727204189.4926262-42440-22752316330983=/root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983 <<< 41445 1727204189.52086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.52090: stdout chunk (state=3): >>><<< 41445 1727204189.52095: stderr chunk (state=3): >>><<< 41445 1727204189.52183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204189.4926262-42440-22752316330983=/root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204189.52187: variable 'ansible_module_compression' from source: unknown 41445 1727204189.52189: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41445 1727204189.52222: variable 'ansible_facts' from source: unknown 41445 1727204189.52281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py 41445 1727204189.52381: Sending initial data 41445 1727204189.52384: Sent initial data (152 bytes) 41445 1727204189.52806: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204189.52812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204189.52815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.52817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.52819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.52862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.52865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.52907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.54391: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204189.54408: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204189.54437: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204189.54464: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp3l8_8nn0 /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py <<< 41445 1727204189.54472: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py" <<< 41445 1727204189.54496: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp3l8_8nn0" to remote "/root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py" <<< 41445 1727204189.54503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py" <<< 41445 1727204189.55013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.55041: stderr chunk (state=3): >>><<< 41445 1727204189.55044: stdout chunk (state=3): >>><<< 41445 1727204189.55061: done transferring module to remote 41445 1727204189.55069: _low_level_execute_command(): starting 41445 1727204189.55072: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/ /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py && sleep 0' 41445 1727204189.55483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204189.55486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204189.55489: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.55491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.55493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.55541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.55544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.55583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.57422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.57425: stdout chunk (state=3): >>><<< 41445 1727204189.57428: stderr chunk (state=3): >>><<< 41445 1727204189.57430: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204189.57434: _low_level_execute_command(): starting 41445 1727204189.57436: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/AnsiballZ_stat.py && sleep 0' 41445 1727204189.57852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.57856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.57858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204189.57860: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204189.57862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.57913: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.57918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.57956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.72999: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32958, "dev": 23, "nlink": 1, "atime": 1727204188.1269014, "mtime": 1727204188.1269014, "ctime": 1727204188.1269014, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41445 1727204189.74207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204189.74234: stderr chunk (state=3): >>><<< 41445 1727204189.74237: stdout chunk (state=3): >>><<< 41445 1727204189.74253: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 32958, "dev": 23, "nlink": 1, "atime": 1727204188.1269014, "mtime": 1727204188.1269014, "ctime": 1727204188.1269014, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204189.74292: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204189.74306: _low_level_execute_command(): starting 41445 1727204189.74312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204189.4926262-42440-22752316330983/ > /dev/null 2>&1 && sleep 0' 41445 1727204189.74770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.74773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.74778: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.74780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204189.74874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.74890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.74914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204189.76705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204189.76735: stderr chunk (state=3): >>><<< 41445 1727204189.76738: stdout chunk (state=3): >>><<< 41445 1727204189.76752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204189.76758: handler run complete 41445 1727204189.76790: attempt loop complete, returning result 41445 1727204189.76793: _execute() done 41445 1727204189.76799: dumping result to json 41445 1727204189.76801: done dumping result, returning 41445 1727204189.76812: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [028d2410-947f-bf02-eee4-0000000004ff] 41445 1727204189.76815: sending task result for task 028d2410-947f-bf02-eee4-0000000004ff 41445 1727204189.76929: done sending task result for task 028d2410-947f-bf02-eee4-0000000004ff 41445 1727204189.76932: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "atime": 1727204188.1269014,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204188.1269014,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 32958,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/ethtest0",
        "lnk_target": "../../devices/virtual/net/ethtest0",
        "mode": "0777",
        "mtime": 1727204188.1269014,
        "nlink": 1,
        "path": "/sys/class/net/ethtest0",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
41445 1727204189.77017: no more pending results, returning what we have 41445 1727204189.77021: results queue empty 41445 1727204189.77021: checking for 
any_errors_fatal 41445 1727204189.77023: done checking for any_errors_fatal 41445 1727204189.77023: checking for max_fail_percentage 41445 1727204189.77025: done checking for max_fail_percentage 41445 1727204189.77026: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.77027: done checking to see if all hosts have failed 41445 1727204189.77027: getting the remaining hosts for this loop 41445 1727204189.77029: done getting the remaining hosts for this loop 41445 1727204189.77032: getting the next task for host managed-node3 41445 1727204189.77039: done getting next task for host managed-node3 41445 1727204189.77041: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 41445 1727204189.77044: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.77048: getting variables 41445 1727204189.77049: in VariableManager get_vars() 41445 1727204189.77087: Calling all_inventory to load vars for managed-node3 41445 1727204189.77090: Calling groups_inventory to load vars for managed-node3 41445 1727204189.77093: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.77103: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.77105: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.77108: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.77249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.77421: done with get_vars() 41445 1727204189.77429: done getting variables 41445 1727204189.77528: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 41445 1727204189.77646: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.326) 0:00:08.564 ***** 41445 1727204189.77674: entering _queue_task() for managed-node3/assert 41445 1727204189.77678: Creating lock for assert 41445 1727204189.77968: worker is 1 (out of 1 available) 41445 1727204189.78184: exiting _queue_task() for managed-node3/assert 41445 1727204189.78194: done queuing things up, now waiting for results queue to drain 41445 1727204189.78196: waiting for pending results... 
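The `auto-mux: Trying existing master at '/root/.ansible/cp/...'` stderr above comes from OpenSSH connection multiplexing, which Ansible's SSH connection plugin enables by default (roughly `-o ControlMaster=auto -o ControlPersist=60s` plus a hashed ControlPath under `~/.ansible/cp`). A standalone ssh_config fragment with the same effect would look something like this (values illustrative, not copied from this run):

```
Host *
    ControlMaster auto
    ControlPath ~/.ansible/cp/%C
    ControlPersist 60s
```

Here `%C` is OpenSSH's hash of the local host, remote host, port, and user, which is why the socket name in the log is an opaque hex string. Reusing the master is what makes the second and later tasks to the same host skip key exchange entirely (`mux_client_request_session: master session id: 2`).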
41445 1727204189.78327: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' 41445 1727204189.78382: in run() - task 028d2410-947f-bf02-eee4-0000000003e1 41445 1727204189.78423: variable 'ansible_search_path' from source: unknown 41445 1727204189.78426: variable 'ansible_search_path' from source: unknown 41445 1727204189.78452: calling self._execute() 41445 1727204189.78548: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.78581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.78585: variable 'omit' from source: magic vars 41445 1727204189.78952: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.78971: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.78979: variable 'omit' from source: magic vars 41445 1727204189.79019: variable 'omit' from source: magic vars 41445 1727204189.79091: variable 'interface' from source: set_fact 41445 1727204189.79105: variable 'omit' from source: magic vars 41445 1727204189.79150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204189.79178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204189.79193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204189.79207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204189.79219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204189.79241: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204189.79244: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.79249: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.79321: Set connection var ansible_shell_executable to /bin/sh 41445 1727204189.79324: Set connection var ansible_shell_type to sh 41445 1727204189.79327: Set connection var ansible_pipelining to False 41445 1727204189.79335: Set connection var ansible_timeout to 10 41445 1727204189.79337: Set connection var ansible_connection to ssh 41445 1727204189.79343: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204189.79361: variable 'ansible_shell_executable' from source: unknown 41445 1727204189.79370: variable 'ansible_connection' from source: unknown 41445 1727204189.79373: variable 'ansible_module_compression' from source: unknown 41445 1727204189.79377: variable 'ansible_shell_type' from source: unknown 41445 1727204189.79379: variable 'ansible_shell_executable' from source: unknown 41445 1727204189.79381: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.79385: variable 'ansible_pipelining' from source: unknown 41445 1727204189.79387: variable 'ansible_timeout' from source: unknown 41445 1727204189.79391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.79497: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204189.79507: variable 'omit' from source: magic vars 41445 1727204189.79510: starting attempt loop 41445 1727204189.79516: running the handler 41445 1727204189.79600: variable 'interface_stat' from source: set_fact 41445 1727204189.79693: Evaluated conditional (interface_stat.stat.exists): True 41445 1727204189.79696: handler run complete 41445 1727204189.79698: attempt loop complete, returning result 41445 
1727204189.79699: _execute() done 41445 1727204189.79701: dumping result to json 41445 1727204189.79703: done dumping result, returning 41445 1727204189.79705: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'ethtest0' [028d2410-947f-bf02-eee4-0000000003e1] 41445 1727204189.79707: sending task result for task 028d2410-947f-bf02-eee4-0000000003e1 41445 1727204189.79763: done sending task result for task 028d2410-947f-bf02-eee4-0000000003e1 41445 1727204189.79766: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41445 1727204189.79815: no more pending results, returning what we have 41445 1727204189.79818: results queue empty 41445 1727204189.79819: checking for any_errors_fatal 41445 1727204189.79827: done checking for any_errors_fatal 41445 1727204189.79828: checking for max_fail_percentage 41445 1727204189.79829: done checking for max_fail_percentage 41445 1727204189.79830: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.79831: done checking to see if all hosts have failed 41445 1727204189.79832: getting the remaining hosts for this loop 41445 1727204189.79833: done getting the remaining hosts for this loop 41445 1727204189.79836: getting the next task for host managed-node3 41445 1727204189.79842: done getting next task for host managed-node3 41445 1727204189.79846: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204189.79848: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204189.79861: getting variables 41445 1727204189.79862: in VariableManager get_vars() 41445 1727204189.79898: Calling all_inventory to load vars for managed-node3 41445 1727204189.79900: Calling groups_inventory to load vars for managed-node3 41445 1727204189.79903: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.79912: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.79914: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.79917: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.80031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.80162: done with get_vars() 41445 1727204189.80170: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.025) 0:00:08.590 ***** 41445 1727204189.80235: entering _queue_task() for managed-node3/include_tasks 41445 1727204189.80425: worker is 1 (out of 1 available) 41445 1727204189.80438: exiting _queue_task() for managed-node3/include_tasks 41445 1727204189.80449: done queuing things up, now waiting for results queue to drain 41445 1727204189.80451: waiting for pending results... 
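The stat result earlier shows what the "Get stat for interface" / "Assert that the interface is present" pair actually verifies: `/sys/class/net/ethtest0` exists and is a symlink into `/sys/devices/...` (`islnk: true`, `exists: true`), and the assert only checks `interface_stat.stat.exists`. A minimal Python sketch of the same check (function name is mine, not from the role):

```python
import os

def interface_present(name, sys_class_net="/sys/class/net"):
    """Return True when the kernel exposes a network interface.

    /sys/class/net/<name> is a symlink into /sys/devices/... (the stat
    result above shows islnk=True with lnk_target=../../devices/...),
    so lexists() is sufficient; the link does not need to be followed.
    """
    return os.path.lexists(os.path.join(sys_class_net, name))
```

This is why the task reports `ok` with `"changed": false`: stat and assert are both read-only.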
41445 1727204189.80643: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204189.80743: in run() - task 028d2410-947f-bf02-eee4-000000000016 41445 1727204189.80754: variable 'ansible_search_path' from source: unknown 41445 1727204189.80758: variable 'ansible_search_path' from source: unknown 41445 1727204189.80808: calling self._execute() 41445 1727204189.80872: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.81081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.81085: variable 'omit' from source: magic vars 41445 1727204189.81228: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.81245: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.81255: _execute() done 41445 1727204189.81262: dumping result to json 41445 1727204189.81268: done dumping result, returning 41445 1727204189.81279: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-bf02-eee4-000000000016] 41445 1727204189.81290: sending task result for task 028d2410-947f-bf02-eee4-000000000016 41445 1727204189.81398: done sending task result for task 028d2410-947f-bf02-eee4-000000000016 41445 1727204189.81406: WORKER PROCESS EXITING 41445 1727204189.81449: no more pending results, returning what we have 41445 1727204189.81454: in VariableManager get_vars() 41445 1727204189.81503: Calling all_inventory to load vars for managed-node3 41445 1727204189.81506: Calling groups_inventory to load vars for managed-node3 41445 1727204189.81511: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.81524: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.81526: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.81529: Calling 
groups_plugins_play to load vars for managed-node3 41445 1727204189.81759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.81991: done with get_vars() 41445 1727204189.81999: variable 'ansible_search_path' from source: unknown 41445 1727204189.82000: variable 'ansible_search_path' from source: unknown 41445 1727204189.82040: we have included files to process 41445 1727204189.82041: generating all_blocks data 41445 1727204189.82043: done generating all_blocks data 41445 1727204189.82047: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204189.82048: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204189.82051: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204189.82747: done processing included file 41445 1727204189.82749: iterating over new_blocks loaded from include file 41445 1727204189.82750: in VariableManager get_vars() 41445 1727204189.82765: done with get_vars() 41445 1727204189.82766: filtering new block on tags 41445 1727204189.82786: done filtering new block on tags 41445 1727204189.82789: in VariableManager get_vars() 41445 1727204189.82814: done with get_vars() 41445 1727204189.82816: filtering new block on tags 41445 1727204189.82829: done filtering new block on tags 41445 1727204189.82830: in VariableManager get_vars() 41445 1727204189.82844: done with get_vars() 41445 1727204189.82845: filtering new block on tags 41445 1727204189.82857: done filtering new block on tags 41445 1727204189.82858: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41445 1727204189.82861: extending task lists for 
all hosts with included blocks 41445 1727204189.83297: done extending task lists 41445 1727204189.83299: done processing included files 41445 1727204189.83299: results queue empty 41445 1727204189.83299: checking for any_errors_fatal 41445 1727204189.83301: done checking for any_errors_fatal 41445 1727204189.83302: checking for max_fail_percentage 41445 1727204189.83302: done checking for max_fail_percentage 41445 1727204189.83303: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.83303: done checking to see if all hosts have failed 41445 1727204189.83304: getting the remaining hosts for this loop 41445 1727204189.83305: done getting the remaining hosts for this loop 41445 1727204189.83306: getting the next task for host managed-node3 41445 1727204189.83309: done getting next task for host managed-node3 41445 1727204189.83311: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204189.83313: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.83320: getting variables 41445 1727204189.83321: in VariableManager get_vars() 41445 1727204189.83514: Calling all_inventory to load vars for managed-node3 41445 1727204189.83516: Calling groups_inventory to load vars for managed-node3 41445 1727204189.83518: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.83521: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.83523: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.83525: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.83612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.83731: done with get_vars() 41445 1727204189.83737: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.035) 0:00:08.625 ***** 41445 1727204189.83789: entering _queue_task() for managed-node3/setup 41445 1727204189.83993: worker is 1 (out of 1 available) 41445 1727204189.84007: exiting _queue_task() for managed-node3/setup 41445 1727204189.84017: done queuing things up, now waiting for results queue to drain 41445 1727204189.84019: waiting for pending results... 
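The "filtering new block on tags" steps above run once per block loaded from `set_facts.yml`. As a rough sketch (a simplification of Ansible's real tag logic, with illustrative tag names), a task survives filtering when it matches `--tags` (or none were given) and does not match `--skip-tags`, with `always`-tagged tasks kept regardless of `--tags`:

```python
def filter_block_on_tags(tasks, only_tags=frozenset(), skip_tags=frozenset()):
    """Hedged sketch of per-block tag filtering, not Ansible's actual code."""
    kept = []
    for task in tasks:
        tags = set(task.get("tags", ()))
        # --tags given: keep only matching tasks, plus anything tagged 'always'
        if only_tags and not (tags & set(only_tags)) and "always" not in tags:
            continue
        # --skip-tags always wins over a match
        if tags & set(skip_tags):
            continue
        kept.append(task)
    return kept
```

In this run no `--tags`/`--skip-tags` were passed, so all three blocks pass through unchanged ("done filtering new block on tags" with nothing dropped).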
41445 1727204189.84185: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204189.84381: in run() - task 028d2410-947f-bf02-eee4-000000000517 41445 1727204189.84385: variable 'ansible_search_path' from source: unknown 41445 1727204189.84388: variable 'ansible_search_path' from source: unknown 41445 1727204189.84391: calling self._execute() 41445 1727204189.84493: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.84535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.84608: variable 'omit' from source: magic vars 41445 1727204189.85414: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.85487: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.85737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204189.87261: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204189.87308: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204189.87335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204189.87360: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204189.87384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204189.87530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204189.87534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204189.87537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204189.87555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204189.87558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204189.87630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204189.87633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204189.87662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204189.87747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204189.87751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204189.88083: variable '__network_required_facts' from source: role 
'' defaults 41445 1727204189.88085: variable 'ansible_facts' from source: unknown 41445 1727204189.88088: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41445 1727204189.88090: when evaluation is False, skipping this task 41445 1727204189.88092: _execute() done 41445 1727204189.88093: dumping result to json 41445 1727204189.88095: done dumping result, returning 41445 1727204189.88097: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-bf02-eee4-000000000517] 41445 1727204189.88099: sending task result for task 028d2410-947f-bf02-eee4-000000000517 41445 1727204189.88163: done sending task result for task 028d2410-947f-bf02-eee4-000000000517 41445 1727204189.88166: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204189.88210: no more pending results, returning what we have 41445 1727204189.88213: results queue empty 41445 1727204189.88214: checking for any_errors_fatal 41445 1727204189.88216: done checking for any_errors_fatal 41445 1727204189.88216: checking for max_fail_percentage 41445 1727204189.88218: done checking for max_fail_percentage 41445 1727204189.88219: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.88219: done checking to see if all hosts have failed 41445 1727204189.88220: getting the remaining hosts for this loop 41445 1727204189.88221: done getting the remaining hosts for this loop 41445 1727204189.88224: getting the next task for host managed-node3 41445 1727204189.88231: done getting next task for host managed-node3 41445 1727204189.88235: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204189.88238: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204189.88250: getting variables 41445 1727204189.88251: in VariableManager get_vars() 41445 1727204189.88289: Calling all_inventory to load vars for managed-node3 41445 1727204189.88292: Calling groups_inventory to load vars for managed-node3 41445 1727204189.88294: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.88304: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.88306: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.88311: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.88615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.88887: done with get_vars() 41445 1727204189.88898: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.051) 0:00:08.677 ***** 41445 1727204189.88997: entering _queue_task() for managed-node3/stat 41445 1727204189.89238: worker is 1 (out of 1 
available) 41445 1727204189.89251: exiting _queue_task() for managed-node3/stat 41445 1727204189.89262: done queuing things up, now waiting for results queue to drain 41445 1727204189.89263: waiting for pending results... 41445 1727204189.89693: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204189.89714: in run() - task 028d2410-947f-bf02-eee4-000000000519 41445 1727204189.89736: variable 'ansible_search_path' from source: unknown 41445 1727204189.89744: variable 'ansible_search_path' from source: unknown 41445 1727204189.89787: calling self._execute() 41445 1727204189.89882: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.89894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.89910: variable 'omit' from source: magic vars 41445 1727204189.90307: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.90324: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.90499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204189.90769: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204189.90820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204189.90857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204189.90902: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204189.90990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204189.91081: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204189.91085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204189.91087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204189.91182: variable '__network_is_ostree' from source: set_fact 41445 1727204189.91193: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204189.91198: when evaluation is False, skipping this task 41445 1727204189.91203: _execute() done 41445 1727204189.91208: dumping result to json 41445 1727204189.91214: done dumping result, returning 41445 1727204189.91230: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-bf02-eee4-000000000519] 41445 1727204189.91241: sending task result for task 028d2410-947f-bf02-eee4-000000000519 41445 1727204189.91581: done sending task result for task 028d2410-947f-bf02-eee4-000000000519 41445 1727204189.91584: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204189.91626: no more pending results, returning what we have 41445 1727204189.91629: results queue empty 41445 1727204189.91630: checking for any_errors_fatal 41445 1727204189.91635: done checking for any_errors_fatal 41445 1727204189.91636: checking for max_fail_percentage 41445 1727204189.91638: done checking for max_fail_percentage 41445 1727204189.91638: checking to see if all hosts have failed and the running result is not ok 41445 
1727204189.91640: done checking to see if all hosts have failed 41445 1727204189.91640: getting the remaining hosts for this loop 41445 1727204189.91641: done getting the remaining hosts for this loop 41445 1727204189.91644: getting the next task for host managed-node3 41445 1727204189.91650: done getting next task for host managed-node3 41445 1727204189.91653: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204189.91657: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204189.91670: getting variables 41445 1727204189.91671: in VariableManager get_vars() 41445 1727204189.91709: Calling all_inventory to load vars for managed-node3 41445 1727204189.91712: Calling groups_inventory to load vars for managed-node3 41445 1727204189.91714: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.91722: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.91725: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.91728: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.91927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.92153: done with get_vars() 41445 1727204189.92163: done getting variables 41445 1727204189.92216: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.032) 0:00:08.710 ***** 41445 1727204189.92248: entering _queue_task() for managed-node3/set_fact 41445 1727204189.92501: worker is 1 (out of 1 available) 41445 1727204189.92514: exiting _queue_task() for managed-node3/set_fact 41445 1727204189.92527: done queuing things up, now waiting for results queue to drain 41445 1727204189.92529: waiting for pending results... 
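The "Ensure ansible_facts used by role are present" task above was skipped because its conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluated to False: every fact the role needs is already cached, so the extra `setup` call is avoided. The same set logic in plain Python (function name is mine):

```python
def facts_gathering_needed(required_facts, ansible_facts):
    """Mirror of the role's conditional: run setup only if a fact is missing.

    Equivalent to the Jinja2 expression in the log:
        required | difference(ansible_facts.keys() | list) | length > 0
    """
    missing = set(required_facts) - set(ansible_facts)
    return len(missing) > 0
```

When it returns False, Ansible logs "when evaluation is False, skipping this task" and reports the skip as unchanged, exactly as seen above.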
41445 1727204189.92800: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204189.92957: in run() - task 028d2410-947f-bf02-eee4-00000000051a 41445 1727204189.92982: variable 'ansible_search_path' from source: unknown 41445 1727204189.92991: variable 'ansible_search_path' from source: unknown 41445 1727204189.93035: calling self._execute() 41445 1727204189.93130: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.93142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.93157: variable 'omit' from source: magic vars 41445 1727204189.93529: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.93551: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.93719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204189.94065: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204189.94117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204189.94154: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204189.94193: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204189.94309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204189.94313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204189.94336: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204189.94366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204189.94457: variable '__network_is_ostree' from source: set_fact 41445 1727204189.94469: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204189.94477: when evaluation is False, skipping this task 41445 1727204189.94524: _execute() done 41445 1727204189.94527: dumping result to json 41445 1727204189.94530: done dumping result, returning 41445 1727204189.94532: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-bf02-eee4-00000000051a] 41445 1727204189.94535: sending task result for task 028d2410-947f-bf02-eee4-00000000051a skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204189.94818: no more pending results, returning what we have 41445 1727204189.94821: results queue empty 41445 1727204189.94822: checking for any_errors_fatal 41445 1727204189.94828: done checking for any_errors_fatal 41445 1727204189.94828: checking for max_fail_percentage 41445 1727204189.94830: done checking for max_fail_percentage 41445 1727204189.94831: checking to see if all hosts have failed and the running result is not ok 41445 1727204189.94832: done checking to see if all hosts have failed 41445 1727204189.94833: getting the remaining hosts for this loop 41445 1727204189.94834: done getting the remaining hosts for this loop 41445 1727204189.94837: getting the next task for host managed-node3 41445 1727204189.94846: done getting next task for host managed-node3 41445 
1727204189.94849: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204189.94852: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204189.94865: getting variables 41445 1727204189.94867: in VariableManager get_vars() 41445 1727204189.94908: Calling all_inventory to load vars for managed-node3 41445 1727204189.94910: Calling groups_inventory to load vars for managed-node3 41445 1727204189.94913: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204189.94923: Calling all_plugins_play to load vars for managed-node3 41445 1727204189.94926: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204189.94929: Calling groups_plugins_play to load vars for managed-node3 41445 1727204189.95225: done sending task result for task 028d2410-947f-bf02-eee4-00000000051a 41445 1727204189.95229: WORKER PROCESS EXITING 41445 1727204189.95251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204189.95473: done with get_vars() 41445 1727204189.95485: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.033) 0:00:08.743 ***** 41445 1727204189.95581: entering _queue_task() for managed-node3/service_facts 41445 1727204189.95583: Creating lock for service_facts 41445 1727204189.95838: worker is 1 (out of 1 available) 41445 1727204189.95852: exiting _queue_task() for managed-node3/service_facts 41445 1727204189.95862: done queuing things up, now waiting for results queue to drain 41445 1727204189.95863: waiting for pending results... 41445 1727204189.96128: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204189.96285: in run() - task 028d2410-947f-bf02-eee4-00000000051c 41445 1727204189.96310: variable 'ansible_search_path' from source: unknown 41445 1727204189.96319: variable 'ansible_search_path' from source: unknown 41445 1727204189.96359: calling self._execute() 41445 1727204189.96449: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.96461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.96479: variable 'omit' from source: magic vars 41445 1727204189.96842: variable 'ansible_distribution_major_version' from source: facts 41445 1727204189.96858: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204189.96868: variable 'omit' from source: magic vars 41445 1727204189.96936: variable 'omit' from source: magic vars 41445 1727204189.96980: variable 'omit' from source: magic vars 41445 1727204189.97025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204189.97068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204189.97093: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204189.97117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204189.97134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204189.97172: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204189.97183: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.97191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.97291: Set connection var ansible_shell_executable to /bin/sh 41445 1727204189.97300: Set connection var ansible_shell_type to sh 41445 1727204189.97309: Set connection var ansible_pipelining to False 41445 1727204189.97321: Set connection var ansible_timeout to 10 41445 1727204189.97328: Set connection var ansible_connection to ssh 41445 1727204189.97339: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204189.97367: variable 'ansible_shell_executable' from source: unknown 41445 1727204189.97378: variable 'ansible_connection' from source: unknown 41445 1727204189.97389: variable 'ansible_module_compression' from source: unknown 41445 1727204189.97396: variable 'ansible_shell_type' from source: unknown 41445 1727204189.97404: variable 'ansible_shell_executable' from source: unknown 41445 1727204189.97411: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204189.97419: variable 'ansible_pipelining' from source: unknown 41445 1727204189.97426: variable 'ansible_timeout' from source: unknown 41445 1727204189.97492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204189.97634: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204189.97651: variable 'omit' from source: magic vars 41445 1727204189.97661: starting attempt loop 41445 1727204189.97668: running the handler 41445 1727204189.97690: _low_level_execute_command(): starting 41445 1727204189.97702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204189.98423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204189.98439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204189.98458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204189.98480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204189.98581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204189.98610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204189.98624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204189.98695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41445 1727204190.00355: stdout chunk (state=3): >>>/root <<< 41445 1727204190.00520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204190.00524: stdout chunk (state=3): >>><<< 41445 1727204190.00526: stderr chunk (state=3): >>><<< 41445 1727204190.00652: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204190.00656: _low_level_execute_command(): starting 41445 1727204190.00659: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336 `" && echo ansible-tmp-1727204190.005516-42473-280695168656336="` echo /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336 `" ) && sleep 0' 41445 
1727204190.01235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204190.01249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204190.01261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204190.01285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204190.01304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204190.01394: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204190.01435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204190.01460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204190.01542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204190.03401: stdout chunk (state=3): >>>ansible-tmp-1727204190.005516-42473-280695168656336=/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336 <<< 41445 1727204190.03570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204190.03574: stdout chunk (state=3): >>><<< 41445 1727204190.03579: stderr chunk (state=3): >>><<< 
41445 1727204190.03781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204190.005516-42473-280695168656336=/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204190.03785: variable 'ansible_module_compression' from source: unknown 41445 1727204190.03788: ANSIBALLZ: Using lock for service_facts 41445 1727204190.03790: ANSIBALLZ: Acquiring lock 41445 1727204190.03792: ANSIBALLZ: Lock acquired: 140182277903456 41445 1727204190.03794: ANSIBALLZ: Creating module 41445 1727204190.23747: ANSIBALLZ: Writing module into payload 41445 1727204190.23852: ANSIBALLZ: Writing module 41445 1727204190.23885: ANSIBALLZ: Renaming module 41445 1727204190.23899: ANSIBALLZ: Done creating module 41445 1727204190.23923: variable 'ansible_facts' from source: unknown 41445 1727204190.24004: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py 41445 1727204190.24172: Sending initial data 41445 1727204190.24185: Sent initial data (161 bytes) 41445 1727204190.24797: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204190.24817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204190.24894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204190.24942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204190.24961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204190.24987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204190.25085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204190.27082: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204190.27129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204190.27163: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpgizfdn01 /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py <<< 41445 1727204190.27166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py" <<< 41445 1727204190.27226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpgizfdn01" to remote "/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py" <<< 41445 1727204190.27991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204190.28034: stderr chunk (state=3): >>><<< 41445 1727204190.28042: stdout chunk (state=3): >>><<< 41445 1727204190.28070: done transferring module to remote 41445 1727204190.28083: _low_level_execute_command(): starting 41445 1727204190.28093: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/ 
/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py && sleep 0' 41445 1727204190.28791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204190.28816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204190.28830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204190.28851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204190.28901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204190.30795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204190.30806: stdout chunk (state=3): >>><<< 41445 1727204190.30816: stderr chunk (state=3): >>><<< 41445 1727204190.30819: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204190.30822: _low_level_execute_command(): starting 41445 1727204190.30825: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/AnsiballZ_service_facts.py && sleep 0' 41445 1727204190.31464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204190.31469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204190.31517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204191.82071: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 41445 1727204191.82105: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 41445 1727204191.82111: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 41445 1727204191.82115: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "syste<<< 41445 1727204191.82128: stdout chunk (state=3): >>>md-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41445 1727204191.83620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204191.83657: stderr chunk (state=3): >>><<< 41445 1727204191.83660: stdout chunk (state=3): >>><<< 41445 1727204191.83683: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", 
"status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204191.84057: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204191.84068: _low_level_execute_command(): starting 41445 1727204191.84071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.005516-42473-280695168656336/ > /dev/null 2>&1 && sleep 0' 41445 1727204191.84542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204191.84545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204191.84548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204191.84550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204191.84600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204191.84611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204191.84613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204191.84640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204191.86398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204191.86426: stderr chunk (state=3): >>><<< 41445 1727204191.86429: stdout chunk (state=3): >>><<< 41445 1727204191.86443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 41445 1727204191.86449: handler run complete 41445 1727204191.86560: variable 'ansible_facts' from source: unknown 41445 1727204191.86653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204191.86915: variable 'ansible_facts' from source: unknown 41445 1727204191.87000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204191.87119: attempt loop complete, returning result 41445 1727204191.87122: _execute() done 41445 1727204191.87124: dumping result to json 41445 1727204191.87156: done dumping result, returning 41445 1727204191.87164: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-bf02-eee4-00000000051c] 41445 1727204191.87169: sending task result for task 028d2410-947f-bf02-eee4-00000000051c ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204191.87715: no more pending results, returning what we have 41445 1727204191.87717: results queue empty 41445 1727204191.87718: checking for any_errors_fatal 41445 1727204191.87721: done checking for any_errors_fatal 41445 1727204191.87722: checking for max_fail_percentage 41445 1727204191.87724: done checking for max_fail_percentage 41445 1727204191.87724: checking to see if all hosts have failed and the running result is not ok 41445 1727204191.87725: done checking to see if all hosts have failed 41445 1727204191.87726: getting the remaining hosts for this loop 41445 1727204191.87727: done getting the remaining hosts for this loop 41445 1727204191.87730: getting the next task for host managed-node3 41445 1727204191.87735: done getting next task for host managed-node3 41445 1727204191.87738: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 41445 1727204191.87740: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204191.87748: getting variables 41445 1727204191.87749: in VariableManager get_vars() 41445 1727204191.87789: Calling all_inventory to load vars for managed-node3 41445 1727204191.87792: Calling groups_inventory to load vars for managed-node3 41445 1727204191.87794: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204191.87799: done sending task result for task 028d2410-947f-bf02-eee4-00000000051c 41445 1727204191.87807: Calling all_plugins_play to load vars for managed-node3 41445 1727204191.87809: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204191.87812: Calling groups_plugins_play to load vars for managed-node3 41445 1727204191.88054: WORKER PROCESS EXITING 41445 1727204191.88064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204191.88354: done with get_vars() 41445 1727204191.88363: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:31 -0400 (0:00:01.928) 0:00:10.672 ***** 41445 1727204191.88438: entering _queue_task() for managed-node3/package_facts 41445 1727204191.88440: Creating lock for package_facts 41445 1727204191.88652: worker is 1 (out of 1 available) 41445 1727204191.88663: exiting _queue_task() for managed-node3/package_facts 41445 1727204191.88677: done queuing things up, now waiting for results queue to drain 41445 1727204191.88678: waiting for pending results... 41445 1727204191.88837: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41445 1727204191.88935: in run() - task 028d2410-947f-bf02-eee4-00000000051d 41445 1727204191.88946: variable 'ansible_search_path' from source: unknown 41445 1727204191.88950: variable 'ansible_search_path' from source: unknown 41445 1727204191.88978: calling self._execute() 41445 1727204191.89045: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204191.89049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204191.89057: variable 'omit' from source: magic vars 41445 1727204191.89317: variable 'ansible_distribution_major_version' from source: facts 41445 1727204191.89327: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204191.89332: variable 'omit' from source: magic vars 41445 1727204191.89380: variable 'omit' from source: magic vars 41445 1727204191.89403: variable 'omit' from source: magic vars 41445 1727204191.89434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204191.89463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204191.89479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
41445 1727204191.89492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204191.89503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204191.89527: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204191.89530: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204191.89532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204191.89601: Set connection var ansible_shell_executable to /bin/sh 41445 1727204191.89605: Set connection var ansible_shell_type to sh 41445 1727204191.89607: Set connection var ansible_pipelining to False 41445 1727204191.89615: Set connection var ansible_timeout to 10 41445 1727204191.89618: Set connection var ansible_connection to ssh 41445 1727204191.89624: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204191.89642: variable 'ansible_shell_executable' from source: unknown 41445 1727204191.89645: variable 'ansible_connection' from source: unknown 41445 1727204191.89648: variable 'ansible_module_compression' from source: unknown 41445 1727204191.89650: variable 'ansible_shell_type' from source: unknown 41445 1727204191.89652: variable 'ansible_shell_executable' from source: unknown 41445 1727204191.89654: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204191.89659: variable 'ansible_pipelining' from source: unknown 41445 1727204191.89663: variable 'ansible_timeout' from source: unknown 41445 1727204191.89665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204191.89805: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204191.89896: variable 'omit' from source: magic vars 41445 1727204191.89899: starting attempt loop 41445 1727204191.89903: running the handler 41445 1727204191.89906: _low_level_execute_command(): starting 41445 1727204191.89908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204191.90339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204191.90343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204191.90345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204191.90348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204191.90401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204191.90405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204191.90407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204191.90444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41445 1727204191.92015: stdout chunk (state=3): >>>/root <<< 41445 1727204191.92114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204191.92140: stderr chunk (state=3): >>><<< 41445 1727204191.92143: stdout chunk (state=3): >>><<< 41445 1727204191.92162: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204191.92177: _low_level_execute_command(): starting 41445 1727204191.92184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162 `" && echo ansible-tmp-1727204191.921607-42532-153856815838162="` echo /root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162 `" ) && sleep 0' 41445 
1727204191.92613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204191.92616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204191.92618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204191.92628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204191.92631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204191.92667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204191.92671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204191.92715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204191.94570: stdout chunk (state=3): >>>ansible-tmp-1727204191.921607-42532-153856815838162=/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162 <<< 41445 1727204191.94675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204191.94705: stderr chunk (state=3): >>><<< 41445 1727204191.94711: stdout chunk (state=3): >>><<< 41445 1727204191.94721: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1727204191.921607-42532-153856815838162=/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204191.94755: variable 'ansible_module_compression' from source: unknown 41445 1727204191.94797: ANSIBALLZ: Using lock for package_facts 41445 1727204191.94800: ANSIBALLZ: Acquiring lock 41445 1727204191.94803: ANSIBALLZ: Lock acquired: 140182283644480 41445 1727204191.94805: ANSIBALLZ: Creating module 41445 1727204192.13109: ANSIBALLZ: Writing module into payload 41445 1727204192.13200: ANSIBALLZ: Writing module 41445 1727204192.13231: ANSIBALLZ: Renaming module 41445 1727204192.13235: ANSIBALLZ: Done creating module 41445 1727204192.13262: variable 'ansible_facts' from source: unknown 41445 1727204192.13385: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py 41445 1727204192.13498: Sending initial data 41445 1727204192.13501: Sent initial data (161 bytes) 41445 1727204192.13949: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204192.13952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.13963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.14019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204192.14023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204192.14034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204192.14083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204192.15708: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204192.15720: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204192.15737: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204192.15772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp03s_spxs /root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py <<< 41445 1727204192.15785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py" <<< 41445 1727204192.15805: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp03s_spxs" to remote "/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py" <<< 41445 1727204192.15812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py" <<< 41445 1727204192.16823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204192.16865: stderr chunk (state=3): >>><<< 41445 1727204192.16868: stdout chunk (state=3): >>><<< 41445 1727204192.16892: done transferring module to remote 41445 1727204192.16904: _low_level_execute_command(): starting 41445 1727204192.16907: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/ /root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py && sleep 0' 41445 1727204192.17357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204192.17361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204192.17363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.17369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204192.17371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.17420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204192.17423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204192.17462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204192.19215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204192.19240: stderr chunk (state=3): >>><<< 41445 1727204192.19245: stdout chunk (state=3): >>><<< 41445 1727204192.19261: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204192.19264: _low_level_execute_command(): starting 41445 1727204192.19268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/AnsiballZ_package_facts.py && sleep 0' 41445 1727204192.19715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204192.19719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.19731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.19791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204192.19797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204192.19807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204192.19839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204192.63758: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 41445 1727204192.63777: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": 
"libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": 
"libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 41445 1727204192.63811: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": 
"libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 41445 1727204192.63836: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": 
"dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source":<<< 41445 1727204192.63840: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": 
"10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "<<< 41445 1727204192.63883: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", 
"version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 41445 1727204192.63922: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", 
"version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 41445 1727204192.63928: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", 
"version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 41445 1727204192.63932: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 41445 1727204192.63935: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": 
"perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 41445 1727204192.63966: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 41445 1727204192.63978: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": 
"python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41445 1727204192.65681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204192.65714: stderr chunk (state=3): >>><<< 41445 1727204192.65717: stdout chunk (state=3): >>><<< 41445 1727204192.65752: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", 
"version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": 
"libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": 
[{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", 
"version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": 
[{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": 
"mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": 
"python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": 
"13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": 
"3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": 
[{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": 
"69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.47.22 closed. 41445 1727204192.67455: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204192.67473: _low_level_execute_command(): starting 41445 1727204192.67477: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204191.921607-42532-153856815838162/ > /dev/null 2>&1 && sleep 0' 41445 1727204192.67938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204192.67941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204192.67943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.67946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204192.67947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204192.67949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204192.68004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204192.68008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204192.68010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204192.68044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204192.69849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204192.69874: stderr chunk (state=3): >>><<< 41445 1727204192.69879: stdout chunk (state=3): >>><<< 41445 1727204192.69893: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204192.69905: handler run complete 41445 1727204192.70346: variable 'ansible_facts' from source: unknown 41445 1727204192.73410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204192.74530: variable 'ansible_facts' from source: unknown 41445 1727204192.74757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204192.75139: attempt loop complete, returning result 41445 1727204192.75151: _execute() done 41445 1727204192.75154: dumping result to json 41445 1727204192.75268: done dumping result, returning 41445 1727204192.75278: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-bf02-eee4-00000000051d] 41445 1727204192.75280: sending task result for task 028d2410-947f-bf02-eee4-00000000051d 41445 1727204192.77197: done sending task result for task 028d2410-947f-bf02-eee4-00000000051d 41445 1727204192.77200: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204192.77304: no more pending results, returning what we have 41445 1727204192.77307: results queue empty 41445 1727204192.77308: checking for any_errors_fatal 41445 1727204192.77312: done checking for any_errors_fatal 41445 1727204192.77312: checking for max_fail_percentage 41445 1727204192.77314: done checking for max_fail_percentage 41445 1727204192.77315: checking to see if all hosts have failed and the running result is not ok 41445 1727204192.77315: done checking to see if all hosts have failed 41445 1727204192.77316: getting the remaining hosts for this loop 41445 1727204192.77317: 
done getting the remaining hosts for this loop 41445 1727204192.77320: getting the next task for host managed-node3 41445 1727204192.77326: done getting next task for host managed-node3 41445 1727204192.77330: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41445 1727204192.77332: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204192.77342: getting variables 41445 1727204192.77343: in VariableManager get_vars() 41445 1727204192.77379: Calling all_inventory to load vars for managed-node3 41445 1727204192.77382: Calling groups_inventory to load vars for managed-node3 41445 1727204192.77384: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204192.77392: Calling all_plugins_play to load vars for managed-node3 41445 1727204192.77395: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204192.77397: Calling groups_plugins_play to load vars for managed-node3 41445 1727204192.78273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204192.79156: done with get_vars() 41445 1727204192.79179: done getting variables 41445 1727204192.79225: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.908) 0:00:11.580 ***** 41445 1727204192.79256: entering _queue_task() for managed-node3/debug 41445 1727204192.79498: worker is 1 (out of 1 available) 41445 1727204192.79515: exiting _queue_task() for managed-node3/debug 41445 1727204192.79527: done queuing things up, now waiting for results queue to drain 41445 1727204192.79528: waiting for pending results... 41445 1727204192.79706: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 41445 1727204192.79800: in run() - task 028d2410-947f-bf02-eee4-000000000017 41445 1727204192.79813: variable 'ansible_search_path' from source: unknown 41445 1727204192.79822: variable 'ansible_search_path' from source: unknown 41445 1727204192.79942: calling self._execute() 41445 1727204192.79985: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204192.79996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204192.80011: variable 'omit' from source: magic vars 41445 1727204192.80397: variable 'ansible_distribution_major_version' from source: facts 41445 1727204192.80417: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204192.80428: variable 'omit' from source: magic vars 41445 1727204192.80485: variable 'omit' from source: magic vars 41445 1727204192.80590: variable 'network_provider' from source: set_fact 41445 1727204192.80681: variable 'omit' from source: magic vars 41445 1727204192.80684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204192.80698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 41445 1727204192.80725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204192.80746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204192.80762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204192.80795: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204192.80803: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204192.80813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204192.80927: Set connection var ansible_shell_executable to /bin/sh 41445 1727204192.80931: Set connection var ansible_shell_type to sh 41445 1727204192.80935: Set connection var ansible_pipelining to False 41445 1727204192.80948: Set connection var ansible_timeout to 10 41445 1727204192.81180: Set connection var ansible_connection to ssh 41445 1727204192.81183: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204192.81185: variable 'ansible_shell_executable' from source: unknown 41445 1727204192.81187: variable 'ansible_connection' from source: unknown 41445 1727204192.81189: variable 'ansible_module_compression' from source: unknown 41445 1727204192.81191: variable 'ansible_shell_type' from source: unknown 41445 1727204192.81193: variable 'ansible_shell_executable' from source: unknown 41445 1727204192.81194: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204192.81196: variable 'ansible_pipelining' from source: unknown 41445 1727204192.81198: variable 'ansible_timeout' from source: unknown 41445 1727204192.81200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204192.81202: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204192.81204: variable 'omit' from source: magic vars 41445 1727204192.81206: starting attempt loop 41445 1727204192.81208: running the handler 41445 1727204192.81249: handler run complete 41445 1727204192.81268: attempt loop complete, returning result 41445 1727204192.81275: _execute() done 41445 1727204192.81284: dumping result to json 41445 1727204192.81291: done dumping result, returning 41445 1727204192.81302: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-bf02-eee4-000000000017] 41445 1727204192.81315: sending task result for task 028d2410-947f-bf02-eee4-000000000017 ok: [managed-node3] => {} MSG: Using network provider: nm 41445 1727204192.81687: no more pending results, returning what we have 41445 1727204192.81691: results queue empty 41445 1727204192.81692: checking for any_errors_fatal 41445 1727204192.81699: done checking for any_errors_fatal 41445 1727204192.81699: checking for max_fail_percentage 41445 1727204192.81701: done checking for max_fail_percentage 41445 1727204192.81702: checking to see if all hosts have failed and the running result is not ok 41445 1727204192.81702: done checking to see if all hosts have failed 41445 1727204192.81703: getting the remaining hosts for this loop 41445 1727204192.81704: done getting the remaining hosts for this loop 41445 1727204192.81707: getting the next task for host managed-node3 41445 1727204192.81712: done getting next task for host managed-node3 41445 1727204192.81716: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41445 
1727204192.81718: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204192.81727: getting variables 41445 1727204192.81729: in VariableManager get_vars() 41445 1727204192.81766: Calling all_inventory to load vars for managed-node3 41445 1727204192.81769: Calling groups_inventory to load vars for managed-node3 41445 1727204192.81771: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204192.81788: Calling all_plugins_play to load vars for managed-node3 41445 1727204192.81791: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204192.81797: done sending task result for task 028d2410-947f-bf02-eee4-000000000017 41445 1727204192.81799: WORKER PROCESS EXITING 41445 1727204192.81803: Calling groups_plugins_play to load vars for managed-node3 41445 1727204192.83194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204192.85006: done with get_vars() 41445 1727204192.85040: done getting variables 41445 1727204192.85108: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.058) 0:00:11.639 ***** 41445 1727204192.85149: entering _queue_task() for managed-node3/fail 41445 1727204192.85601: worker is 1 (out of 1 available) 41445 1727204192.85613: exiting _queue_task() for managed-node3/fail 41445 1727204192.85624: done queuing things up, now waiting for results queue to drain 41445 1727204192.85625: waiting for pending results... 41445 1727204192.85825: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41445 1727204192.85995: in run() - task 028d2410-947f-bf02-eee4-000000000018 41445 1727204192.86017: variable 'ansible_search_path' from source: unknown 41445 1727204192.86026: variable 'ansible_search_path' from source: unknown 41445 1727204192.86079: calling self._execute() 41445 1727204192.86174: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204192.86268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204192.86272: variable 'omit' from source: magic vars 41445 1727204192.86598: variable 'ansible_distribution_major_version' from source: facts 41445 1727204192.86621: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204192.86751: variable 'network_state' from source: role '' defaults 41445 1727204192.86766: Evaluated conditional (network_state != {}): False 41445 1727204192.86774: when evaluation is False, skipping this task 41445 1727204192.86784: _execute() done 41445 1727204192.86792: dumping result to json 41445 1727204192.86800: done dumping result, returning 41445 1727204192.86815: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-bf02-eee4-000000000018] 41445 1727204192.86832: sending task result for task 028d2410-947f-bf02-eee4-000000000018 41445 1727204192.86990: done sending task result for task 028d2410-947f-bf02-eee4-000000000018 41445 1727204192.86995: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204192.87074: no more pending results, returning what we have 41445 1727204192.87080: results queue empty 41445 1727204192.87081: checking for any_errors_fatal 41445 1727204192.87086: done checking for any_errors_fatal 41445 1727204192.87087: checking for max_fail_percentage 41445 1727204192.87089: done checking for max_fail_percentage 41445 1727204192.87090: checking to see if all hosts have failed and the running result is not ok 41445 1727204192.87091: done checking to see if all hosts have failed 41445 1727204192.87092: getting the remaining hosts for this loop 41445 1727204192.87093: done getting the remaining hosts for this loop 41445 1727204192.87097: getting the next task for host managed-node3 41445 1727204192.87107: done getting next task for host managed-node3 41445 1727204192.87110: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41445 1727204192.87114: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41445 1727204192.87129: getting variables 41445 1727204192.87132: in VariableManager get_vars() 41445 1727204192.87395: Calling all_inventory to load vars for managed-node3 41445 1727204192.87398: Calling groups_inventory to load vars for managed-node3 41445 1727204192.87401: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204192.87410: Calling all_plugins_play to load vars for managed-node3 41445 1727204192.87412: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204192.87415: Calling groups_plugins_play to load vars for managed-node3 41445 1727204192.88458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204192.89328: done with get_vars() 41445 1727204192.89347: done getting variables 41445 1727204192.89394: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.042) 0:00:11.681 ***** 41445 1727204192.89420: entering _queue_task() for managed-node3/fail 41445 1727204192.89654: worker is 1 (out of 1 available) 41445 1727204192.89668: exiting _queue_task() for managed-node3/fail 41445 1727204192.89682: done queuing things up, now waiting for results queue to drain 41445 1727204192.89684: waiting for pending results... 
41445 1727204192.89883: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41445 1727204192.90083: in run() - task 028d2410-947f-bf02-eee4-000000000019 41445 1727204192.90087: variable 'ansible_search_path' from source: unknown 41445 1727204192.90089: variable 'ansible_search_path' from source: unknown 41445 1727204192.90093: calling self._execute() 41445 1727204192.90149: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204192.90161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204192.90178: variable 'omit' from source: magic vars 41445 1727204192.90542: variable 'ansible_distribution_major_version' from source: facts 41445 1727204192.90558: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204192.90674: variable 'network_state' from source: role '' defaults 41445 1727204192.90693: Evaluated conditional (network_state != {}): False 41445 1727204192.90701: when evaluation is False, skipping this task 41445 1727204192.90709: _execute() done 41445 1727204192.90717: dumping result to json 41445 1727204192.90725: done dumping result, returning 41445 1727204192.90735: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-bf02-eee4-000000000019] 41445 1727204192.90745: sending task result for task 028d2410-947f-bf02-eee4-000000000019 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204192.91031: no more pending results, returning what we have 41445 1727204192.91035: results queue empty 41445 1727204192.91035: checking for any_errors_fatal 41445 1727204192.91042: done checking for any_errors_fatal 
41445 1727204192.91043: checking for max_fail_percentage 41445 1727204192.91045: done checking for max_fail_percentage 41445 1727204192.91046: checking to see if all hosts have failed and the running result is not ok 41445 1727204192.91047: done checking to see if all hosts have failed 41445 1727204192.91047: getting the remaining hosts for this loop 41445 1727204192.91051: done getting the remaining hosts for this loop 41445 1727204192.91055: getting the next task for host managed-node3 41445 1727204192.91064: done getting next task for host managed-node3 41445 1727204192.91068: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41445 1727204192.91070: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204192.91091: getting variables 41445 1727204192.91093: in VariableManager get_vars() 41445 1727204192.91135: Calling all_inventory to load vars for managed-node3 41445 1727204192.91138: Calling groups_inventory to load vars for managed-node3 41445 1727204192.91140: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204192.91149: Calling all_plugins_play to load vars for managed-node3 41445 1727204192.91151: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204192.91153: Calling groups_plugins_play to load vars for managed-node3 41445 1727204192.91688: done sending task result for task 028d2410-947f-bf02-eee4-000000000019 41445 1727204192.91691: WORKER PROCESS EXITING 41445 1727204192.92060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204192.92910: done with get_vars() 41445 1727204192.92928: done getting variables 41445 1727204192.92970: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.035) 0:00:11.717 ***** 41445 1727204192.92994: entering _queue_task() for managed-node3/fail 41445 1727204192.93245: worker is 1 (out of 1 available) 41445 1727204192.93259: exiting _queue_task() for managed-node3/fail 41445 1727204192.93270: done queuing things up, now waiting for results queue to drain 41445 1727204192.93272: waiting for pending results... 
41445 1727204192.93607: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41445 1727204192.93663: in run() - task 028d2410-947f-bf02-eee4-00000000001a 41445 1727204192.93687: variable 'ansible_search_path' from source: unknown 41445 1727204192.93701: variable 'ansible_search_path' from source: unknown 41445 1727204192.93747: calling self._execute() 41445 1727204192.93855: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204192.93867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204192.93885: variable 'omit' from source: magic vars 41445 1727204192.94290: variable 'ansible_distribution_major_version' from source: facts 41445 1727204192.94310: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204192.94578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204192.96886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204192.96970: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204192.97017: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204192.97058: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204192.97104: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204192.97193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204192.97232: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204192.97309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204192.97313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204192.97336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204192.97447: variable 'ansible_distribution_major_version' from source: facts 41445 1727204192.97468: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41445 1727204192.97598: variable 'ansible_distribution' from source: facts 41445 1727204192.97637: variable '__network_rh_distros' from source: role '' defaults 41445 1727204192.97642: Evaluated conditional (ansible_distribution in __network_rh_distros): True 41445 1727204192.98080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204192.98084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204192.98087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 
1727204192.98089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204192.98091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204192.98093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204192.98095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204192.98117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204192.98160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204192.98181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204192.98234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204192.98261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 41445 1727204192.98293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204192.98343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204192.98362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204192.98684: variable 'network_connections' from source: task vars 41445 1727204192.98701: variable 'interface' from source: set_fact 41445 1727204192.98783: variable 'interface' from source: set_fact 41445 1727204192.98798: variable 'interface' from source: set_fact 41445 1727204192.98863: variable 'interface' from source: set_fact 41445 1727204192.98888: variable 'network_state' from source: role '' defaults 41445 1727204192.98954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204192.99191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204192.99195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204192.99225: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204192.99258: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204192.99312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204192.99349: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204192.99382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204192.99480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204192.99484: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 41445 1727204192.99487: when evaluation is False, skipping this task 41445 1727204192.99489: _execute() done 41445 1727204192.99492: dumping result to json 41445 1727204192.99494: done dumping result, returning 41445 1727204192.99497: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-bf02-eee4-00000000001a] 41445 1727204192.99499: sending task result for task 028d2410-947f-bf02-eee4-00000000001a skipping: [managed-node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 41445 1727204192.99667: no more pending results, returning what we have 41445 1727204192.99672: results queue empty 41445 1727204192.99673: checking for 
any_errors_fatal 41445 1727204192.99681: done checking for any_errors_fatal 41445 1727204192.99682: checking for max_fail_percentage 41445 1727204192.99684: done checking for max_fail_percentage 41445 1727204192.99685: checking to see if all hosts have failed and the running result is not ok 41445 1727204192.99686: done checking to see if all hosts have failed 41445 1727204192.99687: getting the remaining hosts for this loop 41445 1727204192.99688: done getting the remaining hosts for this loop 41445 1727204192.99692: getting the next task for host managed-node3 41445 1727204192.99700: done getting next task for host managed-node3 41445 1727204192.99704: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41445 1727204192.99706: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204192.99720: getting variables 41445 1727204192.99722: in VariableManager get_vars() 41445 1727204192.99767: Calling all_inventory to load vars for managed-node3 41445 1727204192.99770: Calling groups_inventory to load vars for managed-node3 41445 1727204192.99772: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204192.99892: done sending task result for task 028d2410-947f-bf02-eee4-00000000001a 41445 1727204192.99895: WORKER PROCESS EXITING 41445 1727204192.99906: Calling all_plugins_play to load vars for managed-node3 41445 1727204192.99910: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204192.99914: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.01510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.03117: done with get_vars() 41445 1727204193.03146: done getting variables 41445 1727204193.03250: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.102) 0:00:11.820 ***** 41445 1727204193.03283: entering _queue_task() for managed-node3/dnf 41445 1727204193.03719: worker is 1 (out of 1 available) 41445 1727204193.03731: exiting _queue_task() for managed-node3/dnf 41445 1727204193.03741: done queuing things up, now waiting for results queue to drain 41445 1727204193.03742: waiting for pending results... 
41445 1727204193.03931: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41445 1727204193.04085: in run() - task 028d2410-947f-bf02-eee4-00000000001b 41445 1727204193.04107: variable 'ansible_search_path' from source: unknown 41445 1727204193.04122: variable 'ansible_search_path' from source: unknown 41445 1727204193.04164: calling self._execute() 41445 1727204193.04265: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.04278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.04294: variable 'omit' from source: magic vars 41445 1727204193.04703: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.04722: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.04931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204193.07556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204193.07582: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204193.07623: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204193.07682: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204193.07713: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204193.07801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.07835: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.07883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.07922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.08081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.08085: variable 'ansible_distribution' from source: facts 41445 1727204193.08088: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.08090: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41445 1727204193.08194: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.08339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.08367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.08400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.08451: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.08471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.08516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.08550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.08583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.08626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.08680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.08699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.08729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 
1727204193.08765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.08811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.08860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.08994: variable 'network_connections' from source: task vars 41445 1727204193.09011: variable 'interface' from source: set_fact 41445 1727204193.09085: variable 'interface' from source: set_fact 41445 1727204193.09184: variable 'interface' from source: set_fact 41445 1727204193.09187: variable 'interface' from source: set_fact 41445 1727204193.09237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204193.09420: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204193.09476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204193.09535: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204193.09585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204193.09713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204193.09836: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204193.09848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.09850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204193.09873: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204193.10119: variable 'network_connections' from source: task vars 41445 1727204193.10129: variable 'interface' from source: set_fact 41445 1727204193.10202: variable 'interface' from source: set_fact 41445 1727204193.10214: variable 'interface' from source: set_fact 41445 1727204193.10290: variable 'interface' from source: set_fact 41445 1727204193.10382: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204193.10385: when evaluation is False, skipping this task 41445 1727204193.10387: _execute() done 41445 1727204193.10389: dumping result to json 41445 1727204193.10391: done dumping result, returning 41445 1727204193.10398: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-00000000001b] 41445 1727204193.10400: sending task result for task 028d2410-947f-bf02-eee4-00000000001b skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204193.10557: no more pending results, returning what we have 41445 1727204193.10561: results queue 
empty 41445 1727204193.10562: checking for any_errors_fatal 41445 1727204193.10573: done checking for any_errors_fatal 41445 1727204193.10574: checking for max_fail_percentage 41445 1727204193.10578: done checking for max_fail_percentage 41445 1727204193.10579: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.10580: done checking to see if all hosts have failed 41445 1727204193.10580: getting the remaining hosts for this loop 41445 1727204193.10582: done getting the remaining hosts for this loop 41445 1727204193.10586: getting the next task for host managed-node3 41445 1727204193.10595: done getting next task for host managed-node3 41445 1727204193.10599: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41445 1727204193.10601: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204193.10616: getting variables 41445 1727204193.10618: in VariableManager get_vars() 41445 1727204193.10663: Calling all_inventory to load vars for managed-node3 41445 1727204193.10667: Calling groups_inventory to load vars for managed-node3 41445 1727204193.10670: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.10899: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.10903: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.10907: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.11566: done sending task result for task 028d2410-947f-bf02-eee4-00000000001b 41445 1727204193.11570: WORKER PROCESS EXITING 41445 1727204193.13589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.16643: done with get_vars() 41445 1727204193.16683: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41445 1727204193.16769: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.135) 0:00:11.955 ***** 41445 1727204193.16803: entering _queue_task() for managed-node3/yum 41445 1727204193.16804: Creating lock for yum 41445 1727204193.17148: worker is 1 (out of 1 available) 41445 1727204193.17162: exiting _queue_task() for managed-node3/yum 41445 
1727204193.17173: done queuing things up, now waiting for results queue to drain 41445 1727204193.17175: waiting for pending results... 41445 1727204193.17524: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41445 1727204193.17720: in run() - task 028d2410-947f-bf02-eee4-00000000001c 41445 1727204193.17770: variable 'ansible_search_path' from source: unknown 41445 1727204193.17787: variable 'ansible_search_path' from source: unknown 41445 1727204193.17827: calling self._execute() 41445 1727204193.17929: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.17942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.17958: variable 'omit' from source: magic vars 41445 1727204193.18341: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.18360: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.18542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204193.21986: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204193.22174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204193.22242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204193.22348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204193.22450: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204193.22656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.22694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.22773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.22823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.23070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.23182: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.23289: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41445 1727204193.23292: when evaluation is False, skipping this task 41445 1727204193.23294: _execute() done 41445 1727204193.23296: dumping result to json 41445 1727204193.23297: done dumping result, returning 41445 1727204193.23300: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-00000000001c] 41445 1727204193.23302: sending task result for task 028d2410-947f-bf02-eee4-00000000001c 41445 1727204193.23378: done sending task result for task 028d2410-947f-bf02-eee4-00000000001c 41445 1727204193.23381: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41445 1727204193.23443: no more pending results, returning what we have 41445 1727204193.23447: results queue empty 41445 1727204193.23448: checking for any_errors_fatal 41445 1727204193.23454: done checking for any_errors_fatal 41445 1727204193.23455: checking for max_fail_percentage 41445 1727204193.23457: done checking for max_fail_percentage 41445 1727204193.23458: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.23459: done checking to see if all hosts have failed 41445 1727204193.23459: getting the remaining hosts for this loop 41445 1727204193.23460: done getting the remaining hosts for this loop 41445 1727204193.23465: getting the next task for host managed-node3 41445 1727204193.23472: done getting next task for host managed-node3 41445 1727204193.23478: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41445 1727204193.23481: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204193.23495: getting variables 41445 1727204193.23497: in VariableManager get_vars() 41445 1727204193.23543: Calling all_inventory to load vars for managed-node3 41445 1727204193.23547: Calling groups_inventory to load vars for managed-node3 41445 1727204193.23549: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.23560: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.23563: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.23565: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.26754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.30616: done with get_vars() 41445 1727204193.30756: done getting variables 41445 1727204193.30823: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.140) 0:00:12.096 ***** 41445 1727204193.30972: entering _queue_task() for managed-node3/fail 41445 1727204193.31746: worker is 1 (out of 1 available) 41445 1727204193.31758: exiting _queue_task() for managed-node3/fail 41445 1727204193.31768: done queuing things up, now waiting for results queue to drain 41445 1727204193.31770: waiting for pending results... 
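The skip recorded above comes from the role's conditional at `tasks/main.yml:48`: the YUM-based update check only applies to older EL releases, and `ansible_distribution_major_version | int < 8` evaluated to False on this host (note also the earlier `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line). A minimal sketch of what such a guarded task can look like; the exact task body in the role may differ, and only the task name, module, and when-condition are taken from this log:

```yaml
# Hypothetical sketch of the guarded check at tasks/main.yml:48.
# The real role task may differ; the when-condition is quoted from the log.
- name: Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
  ansible.builtin.yum:      # redirected to ansible.builtin.dnf on modern hosts
    list: updates
  when: ansible_distribution_major_version | int < 8
```

Because the condition is False, `_execute()` returns a skip result without ever dispatching the module to the remote host, which is why the log shows no connection activity for this task.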
41445 1727204193.32244: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41445 1727204193.32522: in run() - task 028d2410-947f-bf02-eee4-00000000001d 41445 1727204193.32740: variable 'ansible_search_path' from source: unknown 41445 1727204193.32745: variable 'ansible_search_path' from source: unknown 41445 1727204193.32748: calling self._execute() 41445 1727204193.32845: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.33176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.33181: variable 'omit' from source: magic vars 41445 1727204193.33935: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.33939: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.34031: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.34464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204193.43339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204193.43396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204193.43430: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204193.43466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204193.43505: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204193.43573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204193.43603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.43628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.43672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.43694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.43728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.43751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.43780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.43818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.43833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.43874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.43899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.43924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.43961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.43980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.44150: variable 'network_connections' from source: task vars 41445 1727204193.44153: variable 'interface' from source: set_fact 41445 1727204193.44222: variable 'interface' from source: set_fact 41445 1727204193.44241: variable 'interface' from source: set_fact 41445 1727204193.44322: variable 'interface' from source: set_fact 41445 1727204193.44378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204193.44539: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204193.44574: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204193.44605: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204193.44657: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204193.44680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204193.44705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204193.44726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.44755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204193.44803: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204193.45066: variable 'network_connections' from source: task vars 41445 1727204193.45069: variable 'interface' from source: set_fact 41445 1727204193.45125: variable 'interface' from source: set_fact 41445 1727204193.45128: variable 'interface' from source: set_fact 41445 1727204193.45200: variable 'interface' from source: set_fact 41445 1727204193.45291: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204193.45293: when evaluation is False, skipping this task 41445 1727204193.45295: _execute() done 41445 1727204193.45297: dumping result to json 41445 1727204193.45298: done dumping result, returning 41445 1727204193.45300: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-00000000001d] 41445 1727204193.45308: sending task result for task 028d2410-947f-bf02-eee4-00000000001d 41445 1727204193.45369: done sending task result for task 028d2410-947f-bf02-eee4-00000000001d 41445 1727204193.45372: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204193.45423: no more pending results, returning what we have 41445 1727204193.45426: results queue empty 41445 1727204193.45426: checking for any_errors_fatal 41445 1727204193.45433: done checking for any_errors_fatal 41445 1727204193.45434: checking for max_fail_percentage 41445 1727204193.45435: done checking for max_fail_percentage 41445 1727204193.45436: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.45437: done checking to see if all hosts have failed 41445 1727204193.45438: getting the remaining hosts for this loop 41445 1727204193.45439: done getting the remaining hosts for this loop 41445 1727204193.45443: getting the next task for host managed-node3 41445 1727204193.45448: done getting next task for host managed-node3 41445 1727204193.45451: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41445 1727204193.45454: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41445 1727204193.45466: getting variables 41445 1727204193.45467: in VariableManager get_vars() 41445 1727204193.45574: Calling all_inventory to load vars for managed-node3 41445 1727204193.45579: Calling groups_inventory to load vars for managed-node3 41445 1727204193.45581: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.45590: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.45592: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.45595: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.52265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.53873: done with get_vars() 41445 1727204193.53898: done getting variables 41445 1727204193.53951: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.231) 0:00:12.327 ***** 41445 1727204193.53986: entering _queue_task() for managed-node3/package 41445 1727204193.54435: worker is 1 (out of 1 available) 41445 1727204193.54446: exiting _queue_task() for managed-node3/package 41445 1727204193.54456: done queuing things up, now waiting for results queue to drain 41445 1727204193.54458: waiting for pending results... 
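The consent task above skipped for the same structural reason: neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was true for the configured `network_connections`, so the `fail` action (loaded from `plugins/action/fail.py` in the log) never fired. A hedged sketch of such a consent gate; the actual message text and any confirmation variable in the role may differ, and `network_allow_restart` here is purely illustrative:

```yaml
# Hypothetical sketch of the consent gate at tasks/main.yml:60; only the
# task name, module, and first when-condition are taken from the log.
# network_allow_restart is an invented, illustrative variable name.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Wireless or team interfaces require restarting NetworkManager.
      Confirm by setting the role's restart-consent variable.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not network_allow_restart | default(false)
```

With no wireless or team connections defined, the compound condition is False and the task is reported as `skipping: [managed-node3]`, exactly as logged.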
41445 1727204193.54750: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41445 1727204193.54800: in run() - task 028d2410-947f-bf02-eee4-00000000001e 41445 1727204193.54847: variable 'ansible_search_path' from source: unknown 41445 1727204193.54850: variable 'ansible_search_path' from source: unknown 41445 1727204193.54887: calling self._execute() 41445 1727204193.55065: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.55069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.55072: variable 'omit' from source: magic vars 41445 1727204193.55433: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.55448: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.55653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204193.55937: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204193.55985: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204193.56074: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204193.56117: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204193.56237: variable 'network_packages' from source: role '' defaults 41445 1727204193.56373: variable '__network_provider_setup' from source: role '' defaults 41445 1727204193.56378: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204193.56446: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204193.56460: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204193.56581: variable 
'__network_packages_default_nm' from source: role '' defaults 41445 1727204193.56737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204193.58798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204193.58867: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204193.58922: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204193.58963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204193.59018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204193.59179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.59184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.59187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.59302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.59305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 
1727204193.59308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.59335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.59364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.59415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.59441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.59701: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204193.59830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.59867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.59900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.59950: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.59979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.60082: variable 'ansible_python' from source: facts 41445 1727204193.60177: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204193.60208: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204193.60303: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204193.60442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.60471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.60516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.60560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.60583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.60644: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.60717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.60720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.60767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.60791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.60981: variable 'network_connections' from source: task vars 41445 1727204193.60984: variable 'interface' from source: set_fact 41445 1727204193.61092: variable 'interface' from source: set_fact 41445 1727204193.61105: variable 'interface' from source: set_fact 41445 1727204193.61260: variable 'interface' from source: set_fact 41445 1727204193.61293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204193.61324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204193.61352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.61395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204193.61444: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.61735: variable 'network_connections' from source: task vars 41445 1727204193.61743: variable 'interface' from source: set_fact 41445 1727204193.61851: variable 'interface' from source: set_fact 41445 1727204193.61865: variable 'interface' from source: set_fact 41445 1727204193.61980: variable 'interface' from source: set_fact 41445 1727204193.62128: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204193.62159: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.62503: variable 'network_connections' from source: task vars 41445 1727204193.62516: variable 'interface' from source: set_fact 41445 1727204193.62590: variable 'interface' from source: set_fact 41445 1727204193.62602: variable 'interface' from source: set_fact 41445 1727204193.62670: variable 'interface' from source: set_fact 41445 1727204193.62717: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204193.62801: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204193.63141: variable 'network_connections' from source: task vars 41445 1727204193.63259: variable 'interface' from source: set_fact 41445 1727204193.63262: variable 'interface' from source: set_fact 41445 1727204193.63265: variable 'interface' from source: set_fact 41445 1727204193.63306: variable 'interface' from source: set_fact 41445 1727204193.63386: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 
1727204193.63451: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204193.63463: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204193.63534: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204193.63765: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204193.64464: variable 'network_connections' from source: task vars 41445 1727204193.64485: variable 'interface' from source: set_fact 41445 1727204193.64583: variable 'interface' from source: set_fact 41445 1727204193.64586: variable 'interface' from source: set_fact 41445 1727204193.64623: variable 'interface' from source: set_fact 41445 1727204193.64640: variable 'ansible_distribution' from source: facts 41445 1727204193.64646: variable '__network_rh_distros' from source: role '' defaults 41445 1727204193.64654: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.64680: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204193.64851: variable 'ansible_distribution' from source: facts 41445 1727204193.64912: variable '__network_rh_distros' from source: role '' defaults 41445 1727204193.64915: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.64917: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204193.65064: variable 'ansible_distribution' from source: facts 41445 1727204193.65073: variable '__network_rh_distros' from source: role '' defaults 41445 1727204193.65084: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.65132: variable 'network_provider' from source: set_fact 41445 1727204193.65151: variable 'ansible_facts' from source: unknown 41445 1727204193.65895: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 41445 1727204193.65903: when evaluation is False, skipping this task 41445 1727204193.66080: _execute() done 41445 1727204193.66084: dumping result to json 41445 1727204193.66086: done dumping result, returning 41445 1727204193.66088: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-bf02-eee4-00000000001e] 41445 1727204193.66090: sending task result for task 028d2410-947f-bf02-eee4-00000000001e 41445 1727204193.66163: done sending task result for task 028d2410-947f-bf02-eee4-00000000001e 41445 1727204193.66165: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41445 1727204193.66224: no more pending results, returning what we have 41445 1727204193.66228: results queue empty 41445 1727204193.66229: checking for any_errors_fatal 41445 1727204193.66235: done checking for any_errors_fatal 41445 1727204193.66236: checking for max_fail_percentage 41445 1727204193.66238: done checking for max_fail_percentage 41445 1727204193.66238: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.66239: done checking to see if all hosts have failed 41445 1727204193.66240: getting the remaining hosts for this loop 41445 1727204193.66241: done getting the remaining hosts for this loop 41445 1727204193.66245: getting the next task for host managed-node3 41445 1727204193.66252: done getting next task for host managed-node3 41445 1727204193.66256: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204193.66259: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204193.66285: getting variables 41445 1727204193.66287: in VariableManager get_vars() 41445 1727204193.66339: Calling all_inventory to load vars for managed-node3 41445 1727204193.66342: Calling groups_inventory to load vars for managed-node3 41445 1727204193.66345: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.66355: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.66358: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.66361: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.67927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.69777: done with get_vars() 41445 1727204193.69801: done getting variables 41445 1727204193.69872: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.159) 0:00:12.486 ***** 41445 1727204193.69911: entering _queue_task() for managed-node3/package 41445 1727204193.70255: worker is 1 (out of 1 available) 41445 1727204193.70269: exiting 
_queue_task() for managed-node3/package 41445 1727204193.70394: done queuing things up, now waiting for results queue to drain 41445 1727204193.70396: waiting for pending results... 41445 1727204193.70620: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204193.70691: in run() - task 028d2410-947f-bf02-eee4-00000000001f 41445 1727204193.70717: variable 'ansible_search_path' from source: unknown 41445 1727204193.70725: variable 'ansible_search_path' from source: unknown 41445 1727204193.70762: calling self._execute() 41445 1727204193.70866: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.70875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.70890: variable 'omit' from source: magic vars 41445 1727204193.71305: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.71333: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.71477: variable 'network_state' from source: role '' defaults 41445 1727204193.71483: Evaluated conditional (network_state != {}): False 41445 1727204193.71583: when evaluation is False, skipping this task 41445 1727204193.71586: _execute() done 41445 1727204193.71589: dumping result to json 41445 1727204193.71591: done dumping result, returning 41445 1727204193.71594: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-bf02-eee4-00000000001f] 41445 1727204193.71597: sending task result for task 028d2410-947f-bf02-eee4-00000000001f 41445 1727204193.71673: done sending task result for task 028d2410-947f-bf02-eee4-00000000001f 41445 1727204193.71679: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 41445 1727204193.71735: no more pending results, returning what we have 41445 1727204193.71739: results queue empty 41445 1727204193.71740: checking for any_errors_fatal 41445 1727204193.71746: done checking for any_errors_fatal 41445 1727204193.71747: checking for max_fail_percentage 41445 1727204193.71750: done checking for max_fail_percentage 41445 1727204193.71751: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.71752: done checking to see if all hosts have failed 41445 1727204193.71752: getting the remaining hosts for this loop 41445 1727204193.71753: done getting the remaining hosts for this loop 41445 1727204193.71757: getting the next task for host managed-node3 41445 1727204193.71764: done getting next task for host managed-node3 41445 1727204193.71769: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204193.71771: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204193.71789: getting variables 41445 1727204193.71791: in VariableManager get_vars() 41445 1727204193.71838: Calling all_inventory to load vars for managed-node3 41445 1727204193.71842: Calling groups_inventory to load vars for managed-node3 41445 1727204193.71844: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.71857: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.71860: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.71863: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.73415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.74930: done with get_vars() 41445 1727204193.74953: done getting variables 41445 1727204193.75013: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.051) 0:00:12.538 ***** 41445 1727204193.75044: entering _queue_task() for managed-node3/package 41445 1727204193.75333: worker is 1 (out of 1 available) 41445 1727204193.75345: exiting _queue_task() for managed-node3/package 41445 1727204193.75356: done queuing things up, now waiting for results queue to drain 41445 1727204193.75358: waiting for pending results... 
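The records above show the task "Install NetworkManager and nmstate when using network_state variable" being skipped: `ansible_distribution_major_version != '6'` evaluated True, but `network_state != {}` evaluated False, so the package install never ran. A plausible sketch of what that task (at `roles/network/tasks/main.yml:85`) looks like is below; this is reconstructed from the log, not the role's actual source, and the exact package names are an assumption.

```yaml
# Hedged sketch reconstructed from the log output; the package list and
# exact task layout are assumptions, not the role's verbatim source.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}   # False here, so the log shows "skipping"
```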
41445 1727204193.75702: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204193.75799: in run() - task 028d2410-947f-bf02-eee4-000000000020 41445 1727204193.75803: variable 'ansible_search_path' from source: unknown 41445 1727204193.75806: variable 'ansible_search_path' from source: unknown 41445 1727204193.75827: calling self._execute() 41445 1727204193.75931: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.75942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.76181: variable 'omit' from source: magic vars 41445 1727204193.76335: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.76351: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.76478: variable 'network_state' from source: role '' defaults 41445 1727204193.76493: Evaluated conditional (network_state != {}): False 41445 1727204193.76500: when evaluation is False, skipping this task 41445 1727204193.76507: _execute() done 41445 1727204193.76518: dumping result to json 41445 1727204193.76525: done dumping result, returning 41445 1727204193.76535: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-bf02-eee4-000000000020] 41445 1727204193.76545: sending task result for task 028d2410-947f-bf02-eee4-000000000020 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204193.76855: no more pending results, returning what we have 41445 1727204193.76858: results queue empty 41445 1727204193.76859: checking for any_errors_fatal 41445 1727204193.76866: done checking for any_errors_fatal 41445 1727204193.76867: checking for max_fail_percentage 41445 
1727204193.76868: done checking for max_fail_percentage 41445 1727204193.76869: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.76870: done checking to see if all hosts have failed 41445 1727204193.76870: getting the remaining hosts for this loop 41445 1727204193.76871: done getting the remaining hosts for this loop 41445 1727204193.76876: getting the next task for host managed-node3 41445 1727204193.76882: done getting next task for host managed-node3 41445 1727204193.76886: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204193.76888: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204193.76902: getting variables 41445 1727204193.76904: in VariableManager get_vars() 41445 1727204193.76945: Calling all_inventory to load vars for managed-node3 41445 1727204193.76948: Calling groups_inventory to load vars for managed-node3 41445 1727204193.76950: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.76961: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.76964: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.76967: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.77489: done sending task result for task 028d2410-947f-bf02-eee4-000000000020 41445 1727204193.77492: WORKER PROCESS EXITING 41445 1727204193.78527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.80073: done with get_vars() 41445 1727204193.80098: done getting variables 41445 1727204193.80200: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.051) 0:00:12.590 ***** 41445 1727204193.80235: entering _queue_task() for managed-node3/service 41445 1727204193.80237: Creating lock for service 41445 1727204193.80556: worker is 1 (out of 1 available) 41445 1727204193.80570: exiting _queue_task() for managed-node3/service 41445 1727204193.80583: done queuing things up, now waiting for results queue to drain 41445 1727204193.80584: waiting for pending results... 
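Both nmstate-related install tasks skip for the same reason: `network_state` comes "from source: role '' defaults" and defaults to `{}`. A hedged example of how a caller could make those tasks run instead of skip, by supplying a non-empty `network_state` to the role; the variable name comes from the log, while the host name and interface profile are purely illustrative.

```yaml
# Hypothetical playbook fragment: passing a non-empty network_state would
# flip the "network_state != {}" conditional seen in the log to True.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth0        # illustrative interface only
              type: ethernet
              state: up
```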
41445 1727204193.80848: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204193.80972: in run() - task 028d2410-947f-bf02-eee4-000000000021 41445 1727204193.80996: variable 'ansible_search_path' from source: unknown 41445 1727204193.81005: variable 'ansible_search_path' from source: unknown 41445 1727204193.81043: calling self._execute() 41445 1727204193.81144: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.81154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.81167: variable 'omit' from source: magic vars 41445 1727204193.81565: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.81587: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.81715: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.81923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204193.84282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204193.84286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204193.84288: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204193.84326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204193.84352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204193.84438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41445 1727204193.84474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.84511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.84559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.84581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.84630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.84655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.84683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.84878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.84882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.84885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.84887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.84889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.84890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.84892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.85051: variable 'network_connections' from source: task vars 41445 1727204193.85067: variable 'interface' from source: set_fact 41445 1727204193.85149: variable 'interface' from source: set_fact 41445 1727204193.85163: variable 'interface' from source: set_fact 41445 1727204193.85234: variable 'interface' from source: set_fact 41445 1727204193.85317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204193.85503: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204193.85550: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204193.85588: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204193.85623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204193.85672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204193.85701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204193.85733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.85763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204193.85880: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204193.86085: variable 'network_connections' from source: task vars 41445 1727204193.86100: variable 'interface' from source: set_fact 41445 1727204193.86164: variable 'interface' from source: set_fact 41445 1727204193.86177: variable 'interface' from source: set_fact 41445 1727204193.86244: variable 'interface' from source: set_fact 41445 1727204193.86286: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204193.86293: when evaluation is False, skipping this task 41445 1727204193.86299: _execute() done 41445 1727204193.86479: dumping result to json 41445 1727204193.86483: done dumping result, returning 41445 1727204193.86486: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-000000000021] 41445 1727204193.86496: sending task result for task 028d2410-947f-bf02-eee4-000000000021 41445 1727204193.86565: done sending task result for task 028d2410-947f-bf02-eee4-000000000021 41445 1727204193.86568: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204193.86620: no more pending results, returning what we have 41445 1727204193.86624: results queue empty 41445 1727204193.86624: checking for any_errors_fatal 41445 1727204193.86633: done checking for any_errors_fatal 41445 1727204193.86634: checking for max_fail_percentage 41445 1727204193.86636: done checking for max_fail_percentage 41445 1727204193.86637: checking to see if all hosts have failed and the running result is not ok 41445 1727204193.86638: done checking to see if all hosts have failed 41445 1727204193.86638: getting the remaining hosts for this loop 41445 1727204193.86640: done getting the remaining hosts for this loop 41445 1727204193.86643: getting the next task for host managed-node3 41445 1727204193.86651: done getting next task for host managed-node3 41445 1727204193.86654: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204193.86657: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41445 1727204193.86672: getting variables 41445 1727204193.86674: in VariableManager get_vars() 41445 1727204193.86722: Calling all_inventory to load vars for managed-node3 41445 1727204193.86725: Calling groups_inventory to load vars for managed-node3 41445 1727204193.86727: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204193.86738: Calling all_plugins_play to load vars for managed-node3 41445 1727204193.86741: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204193.86744: Calling groups_plugins_play to load vars for managed-node3 41445 1727204193.88220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204193.89820: done with get_vars() 41445 1727204193.89844: done getting variables 41445 1727204193.89906: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.097) 0:00:12.687 ***** 41445 1727204193.89941: entering _queue_task() for managed-node3/service 41445 1727204193.90252: worker is 1 (out of 1 available) 41445 1727204193.90268: exiting _queue_task() for managed-node3/service 41445 1727204193.90281: done queuing things up, now waiting for results queue to drain 41445 1727204193.90283: waiting for pending results... 
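The preceding task ("Restart NetworkManager due to wireless or team interfaces") skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was True for the configured connections. The task being queued next, "Enable and start NetworkManager" (`tasks/main.yml:122`), will run, since the log shows `network_provider == "nm" or network_state != {}` evaluating True. A minimal sketch of that service task, assembled from the conditionals and variable names visible in the log; the actual role source may differ.

```yaml
# Hedged sketch from the log's evaluated conditionals; network_service_name
# resolves via role defaults (seen above as '__network_service_name_default_nm').
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm" or network_state != {}   # True in this run
```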
41445 1727204193.90694: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204193.90699: in run() - task 028d2410-947f-bf02-eee4-000000000022 41445 1727204193.90723: variable 'ansible_search_path' from source: unknown 41445 1727204193.90729: variable 'ansible_search_path' from source: unknown 41445 1727204193.90765: calling self._execute() 41445 1727204193.90860: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204193.90871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204193.90886: variable 'omit' from source: magic vars 41445 1727204193.91245: variable 'ansible_distribution_major_version' from source: facts 41445 1727204193.91263: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204193.91430: variable 'network_provider' from source: set_fact 41445 1727204193.91442: variable 'network_state' from source: role '' defaults 41445 1727204193.91457: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41445 1727204193.91474: variable 'omit' from source: magic vars 41445 1727204193.91581: variable 'omit' from source: magic vars 41445 1727204193.91584: variable 'network_service_name' from source: role '' defaults 41445 1727204193.91651: variable 'network_service_name' from source: role '' defaults 41445 1727204193.91766: variable '__network_provider_setup' from source: role '' defaults 41445 1727204193.91779: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204193.91850: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204193.91866: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204193.91938: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204193.92181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41445 1727204193.94600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204193.94881: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204193.94884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204193.94886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204193.94888: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204193.94891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.94900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.94933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.94984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.95016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.95069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204193.95104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.95140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.95184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.95202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.95511: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204193.95812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.95841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.95868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.95920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.95938: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.96035: variable 'ansible_python' from source: facts 41445 1727204193.96061: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204193.96152: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204193.96240: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204193.96371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.96424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.96439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.96485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.96533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.96561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204193.96600: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204193.96631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.96750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204193.96754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204193.96851: variable 'network_connections' from source: task vars 41445 1727204193.96870: variable 'interface' from source: set_fact 41445 1727204193.96952: variable 'interface' from source: set_fact 41445 1727204193.96973: variable 'interface' from source: set_fact 41445 1727204193.97055: variable 'interface' from source: set_fact 41445 1727204193.97213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204193.97429: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204193.97481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204193.97532: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204193.97573: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204193.97680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204193.97683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204193.98069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204193.98072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204193.98283: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.98654: variable 'network_connections' from source: task vars 41445 1727204193.98691: variable 'interface' from source: set_fact 41445 1727204193.98786: variable 'interface' from source: set_fact 41445 1727204193.98944: variable 'interface' from source: set_fact 41445 1727204193.99018: variable 'interface' from source: set_fact 41445 1727204193.99095: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204193.99353: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204193.99857: variable 'network_connections' from source: task vars 41445 1727204194.00118: variable 'interface' from source: set_fact 41445 1727204194.00121: variable 'interface' from source: set_fact 41445 1727204194.00124: variable 'interface' from source: set_fact 41445 1727204194.00345: variable 'interface' from source: set_fact 41445 1727204194.00381: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204194.00467: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204194.01090: variable 
'network_connections' from source: task vars 41445 1727204194.01217: variable 'interface' from source: set_fact 41445 1727204194.01333: variable 'interface' from source: set_fact 41445 1727204194.01347: variable 'interface' from source: set_fact 41445 1727204194.01644: variable 'interface' from source: set_fact 41445 1727204194.01647: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204194.01806: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204194.01821: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204194.01998: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204194.02263: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204194.02926: variable 'network_connections' from source: task vars 41445 1727204194.02939: variable 'interface' from source: set_fact 41445 1727204194.03003: variable 'interface' from source: set_fact 41445 1727204194.03019: variable 'interface' from source: set_fact 41445 1727204194.03087: variable 'interface' from source: set_fact 41445 1727204194.03108: variable 'ansible_distribution' from source: facts 41445 1727204194.03163: variable '__network_rh_distros' from source: role '' defaults 41445 1727204194.03166: variable 'ansible_distribution_major_version' from source: facts 41445 1727204194.03168: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204194.03363: variable 'ansible_distribution' from source: facts 41445 1727204194.03373: variable '__network_rh_distros' from source: role '' defaults 41445 1727204194.03390: variable 'ansible_distribution_major_version' from source: facts 41445 1727204194.03408: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204194.03587: variable 'ansible_distribution' from source: 
facts 41445 1727204194.03669: variable '__network_rh_distros' from source: role '' defaults 41445 1727204194.03673: variable 'ansible_distribution_major_version' from source: facts 41445 1727204194.03718: variable 'network_provider' from source: set_fact 41445 1727204194.03781: variable 'omit' from source: magic vars 41445 1727204194.03784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204194.03856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204194.03925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204194.04041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204194.04045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204194.04048: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204194.04050: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204194.04052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204194.04168: Set connection var ansible_shell_executable to /bin/sh 41445 1727204194.04178: Set connection var ansible_shell_type to sh 41445 1727204194.04190: Set connection var ansible_pipelining to False 41445 1727204194.04202: Set connection var ansible_timeout to 10 41445 1727204194.04212: Set connection var ansible_connection to ssh 41445 1727204194.04225: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204194.04271: variable 'ansible_shell_executable' from source: unknown 41445 1727204194.04283: variable 'ansible_connection' from source: unknown 41445 1727204194.04371: variable 'ansible_module_compression' from source: unknown 41445 1727204194.04374: 
variable 'ansible_shell_type' from source: unknown 41445 1727204194.04378: variable 'ansible_shell_executable' from source: unknown 41445 1727204194.04381: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204194.04387: variable 'ansible_pipelining' from source: unknown 41445 1727204194.04389: variable 'ansible_timeout' from source: unknown 41445 1727204194.04391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204194.04450: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204194.04466: variable 'omit' from source: magic vars 41445 1727204194.04482: starting attempt loop 41445 1727204194.04490: running the handler 41445 1727204194.04571: variable 'ansible_facts' from source: unknown 41445 1727204194.05327: _low_level_execute_command(): starting 41445 1727204194.05340: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204194.06027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204194.06092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204194.06112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204194.06228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204194.06374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204194.06404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204194.06491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204194.08551: stdout chunk (state=3): >>>/root <<< 41445 1727204194.08555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204194.08557: stdout chunk (state=3): >>><<< 41445 1727204194.08559: stderr chunk (state=3): >>><<< 41445 1727204194.08563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204194.08567: _low_level_execute_command(): starting 41445 1727204194.08570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898 `" && echo ansible-tmp-1727204194.0846395-42588-245560582699898="` echo /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898 `" ) && sleep 0' 41445 1727204194.09716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204194.09730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204194.09789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204194.09999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204194.10014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204194.10028: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204194.10279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204194.12130: stdout chunk (state=3): >>>ansible-tmp-1727204194.0846395-42588-245560582699898=/root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898 <<< 41445 1727204194.12241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204194.12583: stderr chunk (state=3): >>><<< 41445 1727204194.12586: stdout chunk (state=3): >>><<< 41445 1727204194.12589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204194.0846395-42588-245560582699898=/root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204194.12591: variable 'ansible_module_compression' from 
source: unknown 41445 1727204194.12594: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 41445 1727204194.12597: ANSIBALLZ: Acquiring lock 41445 1727204194.12599: ANSIBALLZ: Lock acquired: 140182283768784 41445 1727204194.12601: ANSIBALLZ: Creating module 41445 1727204194.60899: ANSIBALLZ: Writing module into payload 41445 1727204194.61090: ANSIBALLZ: Writing module 41445 1727204194.61123: ANSIBALLZ: Renaming module 41445 1727204194.61135: ANSIBALLZ: Done creating module 41445 1727204194.61160: variable 'ansible_facts' from source: unknown 41445 1727204194.61371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py 41445 1727204194.61525: Sending initial data 41445 1727204194.61528: Sent initial data (156 bytes) 41445 1727204194.62283: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204194.62298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204194.62312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 
1727204194.62331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204194.62404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204194.64025: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41445 1727204194.64066: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204194.64098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204194.64137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmprk_8af6o /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py <<< 41445 1727204194.64159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py" <<< 41445 1727204194.64198: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmprk_8af6o" to remote "/root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py" <<< 41445 1727204194.65722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204194.65870: stderr chunk (state=3): >>><<< 41445 1727204194.65874: stdout chunk (state=3): >>><<< 41445 1727204194.65883: done transferring module to remote 41445 1727204194.65885: _low_level_execute_command(): starting 41445 1727204194.65888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/ /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py && sleep 0' 41445 1727204194.66452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204194.66468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204194.66487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204194.66594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204194.66608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204194.66634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204194.66706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204194.68464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204194.68482: stdout chunk (state=3): >>><<< 41445 1727204194.68495: stderr chunk (state=3): >>><<< 41445 1727204194.68517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204194.68527: _low_level_execute_command(): starting 41445 1727204194.68537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/AnsiballZ_systemd.py && sleep 0' 41445 1727204194.69184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204194.69199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204194.69316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204194.69329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204194.69387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 41445 1727204194.69435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204194.97882: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3290181632", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1800640000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-broker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41445 1727204194.99717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204194.99721: stdout chunk (state=3): >>><<< 41445 1727204194.99723: stderr chunk (state=3): >>><<< 41445 1727204194.99740: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3290181632", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1800640000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-broker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204194.99946: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204195.00045: _low_level_execute_command(): starting 41445 1727204195.00049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204194.0846395-42588-245560582699898/ > /dev/null 2>&1 && sleep 0' 41445 1727204195.00651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204195.00665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204195.00695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204195.00750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.00817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204195.00835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204195.00894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204195.00935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204195.02771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204195.02795: stderr chunk (state=3): >>><<< 41445 1727204195.02805: stdout chunk (state=3): >>><<< 41445 1727204195.02994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204195.02997: handler run complete 41445 1727204195.03000: attempt loop complete, returning result 41445 1727204195.03002: _execute() done 41445 1727204195.03004: dumping result to json 41445 1727204195.03006: done dumping result, returning 41445 1727204195.03009: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-bf02-eee4-000000000022] 41445 1727204195.03011: sending task result for task 028d2410-947f-bf02-eee4-000000000022 41445 1727204195.03632: done sending task result for task 028d2410-947f-bf02-eee4-000000000022 41445 1727204195.03636: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204195.03693: no more pending results, returning what we have 41445 1727204195.03697: results queue empty 41445 1727204195.03698: checking for any_errors_fatal 41445 1727204195.03706: done checking for any_errors_fatal 41445 1727204195.03707: checking for max_fail_percentage 41445 1727204195.03709: done checking for max_fail_percentage 41445 1727204195.03710: checking to see if all hosts have failed and the running result is not ok 41445 1727204195.03711: done checking to see if all hosts have failed 41445 1727204195.03712: getting the remaining hosts for this loop 41445 1727204195.03713: done getting the remaining hosts for this loop 41445 1727204195.03717: getting 
the next task for host managed-node3 41445 1727204195.03724: done getting next task for host managed-node3 41445 1727204195.03728: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204195.03731: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204195.03741: getting variables 41445 1727204195.03743: in VariableManager get_vars() 41445 1727204195.03817: Calling all_inventory to load vars for managed-node3 41445 1727204195.03820: Calling groups_inventory to load vars for managed-node3 41445 1727204195.03822: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204195.03832: Calling all_plugins_play to load vars for managed-node3 41445 1727204195.03835: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204195.03838: Calling groups_plugins_play to load vars for managed-node3 41445 1727204195.06101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204195.08351: done with get_vars() 41445 1727204195.08380: done getting variables 41445 1727204195.08446: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network 
: Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:35 -0400 (0:00:01.185) 0:00:13.872 ***** 41445 1727204195.08483: entering _queue_task() for managed-node3/service 41445 1727204195.08822: worker is 1 (out of 1 available) 41445 1727204195.08835: exiting _queue_task() for managed-node3/service 41445 1727204195.08847: done queuing things up, now waiting for results queue to drain 41445 1727204195.08848: waiting for pending results... 41445 1727204195.09262: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204195.09620: in run() - task 028d2410-947f-bf02-eee4-000000000023 41445 1727204195.09623: variable 'ansible_search_path' from source: unknown 41445 1727204195.09625: variable 'ansible_search_path' from source: unknown 41445 1727204195.09980: calling self._execute() 41445 1727204195.09985: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204195.09988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204195.09991: variable 'omit' from source: magic vars 41445 1727204195.10618: variable 'ansible_distribution_major_version' from source: facts 41445 1727204195.10774: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204195.11008: variable 'network_provider' from source: set_fact 41445 1727204195.11020: Evaluated conditional (network_provider == "nm"): True 41445 1727204195.11221: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204195.11428: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204195.11755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204195.15972: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204195.16045: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204195.16383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204195.16387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204195.16391: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204195.16602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204195.16639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204195.16668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204195.16721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204195.16741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204195.16865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204195.16897: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204195.16952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204195.17075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204195.17098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204195.17188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204195.17281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204195.17312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204195.17400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204195.17421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
41445 1727204195.17591: variable 'network_connections' from source: task vars 41445 1727204195.17610: variable 'interface' from source: set_fact 41445 1727204195.17693: variable 'interface' from source: set_fact 41445 1727204195.17709: variable 'interface' from source: set_fact 41445 1727204195.17774: variable 'interface' from source: set_fact 41445 1727204195.17863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204195.18053: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204195.18097: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204195.18139: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204195.18173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204195.18223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204195.18256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204195.18290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204195.18325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204195.18386: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204195.18835: variable 'network_connections' 
from source: task vars 41445 1727204195.18846: variable 'interface' from source: set_fact 41445 1727204195.18917: variable 'interface' from source: set_fact 41445 1727204195.18929: variable 'interface' from source: set_fact 41445 1727204195.18998: variable 'interface' from source: set_fact 41445 1727204195.19050: Evaluated conditional (__network_wpa_supplicant_required): False 41445 1727204195.19107: when evaluation is False, skipping this task 41445 1727204195.19110: _execute() done 41445 1727204195.19124: dumping result to json 41445 1727204195.19127: done dumping result, returning 41445 1727204195.19129: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-bf02-eee4-000000000023] 41445 1727204195.19131: sending task result for task 028d2410-947f-bf02-eee4-000000000023 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41445 1727204195.19341: no more pending results, returning what we have 41445 1727204195.19345: results queue empty 41445 1727204195.19346: checking for any_errors_fatal 41445 1727204195.19372: done checking for any_errors_fatal 41445 1727204195.19373: checking for max_fail_percentage 41445 1727204195.19378: done checking for max_fail_percentage 41445 1727204195.19379: checking to see if all hosts have failed and the running result is not ok 41445 1727204195.19380: done checking to see if all hosts have failed 41445 1727204195.19380: getting the remaining hosts for this loop 41445 1727204195.19382: done getting the remaining hosts for this loop 41445 1727204195.19386: getting the next task for host managed-node3 41445 1727204195.19395: done getting next task for host managed-node3 41445 1727204195.19399: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204195.19402: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204195.19417: getting variables 41445 1727204195.19419: in VariableManager get_vars() 41445 1727204195.19464: Calling all_inventory to load vars for managed-node3 41445 1727204195.19467: Calling groups_inventory to load vars for managed-node3 41445 1727204195.19470: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204195.19687: Calling all_plugins_play to load vars for managed-node3 41445 1727204195.19690: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204195.19695: Calling groups_plugins_play to load vars for managed-node3 41445 1727204195.20300: done sending task result for task 028d2410-947f-bf02-eee4-000000000023 41445 1727204195.20304: WORKER PROCESS EXITING 41445 1727204195.21245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204195.22865: done with get_vars() 41445 1727204195.22896: done getting variables 41445 1727204195.22966: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 
Tuesday 24 September 2024 14:56:35 -0400 (0:00:00.145) 0:00:14.017 ***** 41445 1727204195.22998: entering _queue_task() for managed-node3/service 41445 1727204195.23329: worker is 1 (out of 1 available) 41445 1727204195.23342: exiting _queue_task() for managed-node3/service 41445 1727204195.23354: done queuing things up, now waiting for results queue to drain 41445 1727204195.23356: waiting for pending results... 41445 1727204195.23642: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204195.23779: in run() - task 028d2410-947f-bf02-eee4-000000000024 41445 1727204195.23809: variable 'ansible_search_path' from source: unknown 41445 1727204195.23817: variable 'ansible_search_path' from source: unknown 41445 1727204195.23853: calling self._execute() 41445 1727204195.23961: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204195.23972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204195.23989: variable 'omit' from source: magic vars 41445 1727204195.24371: variable 'ansible_distribution_major_version' from source: facts 41445 1727204195.24389: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204195.24507: variable 'network_provider' from source: set_fact 41445 1727204195.24519: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204195.24526: when evaluation is False, skipping this task 41445 1727204195.24534: _execute() done 41445 1727204195.24542: dumping result to json 41445 1727204195.24549: done dumping result, returning 41445 1727204195.24566: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-bf02-eee4-000000000024] 41445 1727204195.24581: sending task result for task 028d2410-947f-bf02-eee4-000000000024 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 
'no_log: true' was specified for this result", "changed": false } 41445 1727204195.24838: no more pending results, returning what we have 41445 1727204195.24842: results queue empty 41445 1727204195.24843: checking for any_errors_fatal 41445 1727204195.24856: done checking for any_errors_fatal 41445 1727204195.24857: checking for max_fail_percentage 41445 1727204195.24859: done checking for max_fail_percentage 41445 1727204195.24860: checking to see if all hosts have failed and the running result is not ok 41445 1727204195.24861: done checking to see if all hosts have failed 41445 1727204195.24862: getting the remaining hosts for this loop 41445 1727204195.24863: done getting the remaining hosts for this loop 41445 1727204195.24867: getting the next task for host managed-node3 41445 1727204195.24874: done getting next task for host managed-node3 41445 1727204195.24880: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204195.24884: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204195.24902: getting variables 41445 1727204195.24904: in VariableManager get_vars() 41445 1727204195.24949: Calling all_inventory to load vars for managed-node3 41445 1727204195.24952: Calling groups_inventory to load vars for managed-node3 41445 1727204195.24955: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204195.24967: Calling all_plugins_play to load vars for managed-node3 41445 1727204195.24970: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204195.24974: Calling groups_plugins_play to load vars for managed-node3 41445 1727204195.25708: done sending task result for task 028d2410-947f-bf02-eee4-000000000024 41445 1727204195.25711: WORKER PROCESS EXITING 41445 1727204195.26498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204195.28232: done with get_vars() 41445 1727204195.28257: done getting variables 41445 1727204195.28532: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:35 -0400 (0:00:00.055) 0:00:14.073 ***** 41445 1727204195.28567: entering _queue_task() for managed-node3/copy 41445 1727204195.29272: worker is 1 (out of 1 available) 41445 1727204195.29285: exiting _queue_task() for managed-node3/copy 41445 1727204195.29295: done queuing things up, now waiting for results queue to drain 41445 1727204195.29296: waiting for pending results... 
41445 1727204195.29501: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204195.29648: in run() - task 028d2410-947f-bf02-eee4-000000000025 41445 1727204195.29670: variable 'ansible_search_path' from source: unknown 41445 1727204195.29680: variable 'ansible_search_path' from source: unknown 41445 1727204195.29723: calling self._execute() 41445 1727204195.29825: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204195.29844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204195.29860: variable 'omit' from source: magic vars 41445 1727204195.30243: variable 'ansible_distribution_major_version' from source: facts 41445 1727204195.30261: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204195.30389: variable 'network_provider' from source: set_fact 41445 1727204195.30400: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204195.30407: when evaluation is False, skipping this task 41445 1727204195.30413: _execute() done 41445 1727204195.30420: dumping result to json 41445 1727204195.30426: done dumping result, returning 41445 1727204195.30438: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-bf02-eee4-000000000025] 41445 1727204195.30480: sending task result for task 028d2410-947f-bf02-eee4-000000000025 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41445 1727204195.30611: no more pending results, returning what we have 41445 1727204195.30616: results queue empty 41445 1727204195.30616: checking for any_errors_fatal 41445 1727204195.30624: done checking for any_errors_fatal 41445 1727204195.30624: checking for max_fail_percentage 41445 
1727204195.30626: done checking for max_fail_percentage 41445 1727204195.30627: checking to see if all hosts have failed and the running result is not ok 41445 1727204195.30628: done checking to see if all hosts have failed 41445 1727204195.30628: getting the remaining hosts for this loop 41445 1727204195.30630: done getting the remaining hosts for this loop 41445 1727204195.30633: getting the next task for host managed-node3 41445 1727204195.30640: done getting next task for host managed-node3 41445 1727204195.30643: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204195.30646: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204195.30659: getting variables 41445 1727204195.30661: in VariableManager get_vars() 41445 1727204195.30708: Calling all_inventory to load vars for managed-node3 41445 1727204195.30711: Calling groups_inventory to load vars for managed-node3 41445 1727204195.30713: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204195.30724: Calling all_plugins_play to load vars for managed-node3 41445 1727204195.30727: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204195.30729: Calling groups_plugins_play to load vars for managed-node3 41445 1727204195.31289: done sending task result for task 028d2410-947f-bf02-eee4-000000000025 41445 1727204195.31292: WORKER PROCESS EXITING 41445 1727204195.32100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204195.33434: done with get_vars() 41445 1727204195.33457: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:35 -0400 (0:00:00.049) 0:00:14.123 ***** 41445 1727204195.33534: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204195.33536: Creating lock for fedora.linux_system_roles.network_connections 41445 1727204195.33793: worker is 1 (out of 1 available) 41445 1727204195.33808: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204195.33821: done queuing things up, now waiting for results queue to drain 41445 1727204195.33822: waiting for pending results... 
41445 1727204195.34009: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204195.34109: in run() - task 028d2410-947f-bf02-eee4-000000000026 41445 1727204195.34124: variable 'ansible_search_path' from source: unknown 41445 1727204195.34129: variable 'ansible_search_path' from source: unknown 41445 1727204195.34156: calling self._execute() 41445 1727204195.34231: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204195.34235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204195.34242: variable 'omit' from source: magic vars 41445 1727204195.34680: variable 'ansible_distribution_major_version' from source: facts 41445 1727204195.34684: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204195.34687: variable 'omit' from source: magic vars 41445 1727204195.34689: variable 'omit' from source: magic vars 41445 1727204195.34857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204195.37155: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204195.37233: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204195.37279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204195.37321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204195.37352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204195.37436: variable 'network_provider' from source: set_fact 41445 1727204195.37570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204195.37621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204195.37882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204195.37885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204195.37888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204195.37890: variable 'omit' from source: magic vars 41445 1727204195.37893: variable 'omit' from source: magic vars 41445 1727204195.37998: variable 'network_connections' from source: task vars 41445 1727204195.38016: variable 'interface' from source: set_fact 41445 1727204195.38139: variable 'interface' from source: set_fact 41445 1727204195.38151: variable 'interface' from source: set_fact 41445 1727204195.38215: variable 'interface' from source: set_fact 41445 1727204195.38449: variable 'omit' from source: magic vars 41445 1727204195.38505: variable '__lsr_ansible_managed' from source: task vars 41445 1727204195.38567: variable '__lsr_ansible_managed' from source: task vars 41445 1727204195.38840: Loaded config def from plugin (lookup/template) 41445 1727204195.38850: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41445 1727204195.38883: File lookup term: get_ansible_managed.j2 41445 
1727204195.38891: variable 'ansible_search_path' from source: unknown 41445 1727204195.38902: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41445 1727204195.38921: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41445 1727204195.38944: variable 'ansible_search_path' from source: unknown 41445 1727204195.46847: variable 'ansible_managed' from source: unknown 41445 1727204195.46998: variable 'omit' from source: magic vars 41445 1727204195.47041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204195.47073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204195.47098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204195.47121: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204195.47140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204195.47171: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204195.47184: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204195.47194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204195.47295: Set connection var ansible_shell_executable to /bin/sh 41445 1727204195.47303: Set connection var ansible_shell_type to sh 41445 1727204195.47313: Set connection var ansible_pipelining to False 41445 1727204195.47324: Set connection var ansible_timeout to 10 41445 1727204195.47329: Set connection var ansible_connection to ssh 41445 1727204195.47340: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204195.47371: variable 'ansible_shell_executable' from source: unknown 41445 1727204195.47382: variable 'ansible_connection' from source: unknown 41445 1727204195.47388: variable 'ansible_module_compression' from source: unknown 41445 1727204195.47394: variable 'ansible_shell_type' from source: unknown 41445 1727204195.47400: variable 'ansible_shell_executable' from source: unknown 41445 1727204195.47408: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204195.47416: variable 'ansible_pipelining' from source: unknown 41445 1727204195.47424: variable 'ansible_timeout' from source: unknown 41445 1727204195.47432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204195.47587: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204195.47610: variable 'omit' from source: magic vars 41445 1727204195.47623: starting attempt loop 41445 1727204195.47680: running the handler 41445 1727204195.47683: _low_level_execute_command(): starting 41445 1727204195.47685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204195.48322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204195.48340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204195.48357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204195.48377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204195.48397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204195.48461: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.48507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204195.48527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204195.48557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 41445 1727204195.48678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204195.50412: stdout chunk (state=3): >>>/root <<< 41445 1727204195.50593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204195.50625: stdout chunk (state=3): >>><<< 41445 1727204195.50629: stderr chunk (state=3): >>><<< 41445 1727204195.50632: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204195.50648: _low_level_execute_command(): starting 41445 1727204195.50653: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350 `" && echo ansible-tmp-1727204195.50631-42634-229122211081350="` echo 
/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350 `" ) && sleep 0' 41445 1727204195.51290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.51383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204195.51387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204195.51389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204195.51440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204195.53292: stdout chunk (state=3): >>>ansible-tmp-1727204195.50631-42634-229122211081350=/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350 <<< 41445 1727204195.53449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204195.53452: stdout chunk (state=3): >>><<< 41445 1727204195.53454: stderr chunk (state=3): >>><<< 41445 1727204195.53684: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204195.50631-42634-229122211081350=/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204195.53687: variable 'ansible_module_compression' from source: unknown 41445 1727204195.53689: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 41445 1727204195.53691: ANSIBALLZ: Acquiring lock 41445 1727204195.53693: ANSIBALLZ: Lock acquired: 140182278010688 41445 1727204195.53695: ANSIBALLZ: Creating module 41445 1727204195.75209: ANSIBALLZ: Writing module into payload 41445 1727204195.75438: ANSIBALLZ: Writing module 41445 1727204195.75456: ANSIBALLZ: Renaming module 41445 1727204195.75461: ANSIBALLZ: Done creating module 41445 1727204195.75483: variable 'ansible_facts' from source: unknown 41445 1727204195.75550: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py 41445 1727204195.75660: Sending initial data 41445 1727204195.75663: Sent initial data (166 bytes) 41445 1727204195.76121: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204195.76128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.76130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204195.76133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204195.76135: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.76173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204195.76182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204195.76194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204195.76249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204195.77879: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204195.77904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204195.77939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpr7360z1k /root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py <<< 41445 1727204195.77946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py" <<< 41445 1727204195.77974: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpr7360z1k" to remote "/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py" <<< 41445 1727204195.77978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py" <<< 41445 1727204195.78655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204195.78693: stderr chunk (state=3): >>><<< 41445 1727204195.78696: stdout chunk (state=3): >>><<< 41445 1727204195.78725: done transferring module to remote 41445 1727204195.78734: _low_level_execute_command(): starting 41445 
1727204195.78738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/ /root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py && sleep 0' 41445 1727204195.79146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204195.79154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204195.79177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.79180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204195.79183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.79231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204195.79234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204195.79282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204195.81008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204195.81034: stderr chunk (state=3): >>><<< 41445 
1727204195.81038: stdout chunk (state=3): >>><<< 41445 1727204195.81054: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204195.81057: _low_level_execute_command(): starting 41445 1727204195.81061: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/AnsiballZ_network_connections.py && sleep 0' 41445 1727204195.81470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204195.81511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204195.81515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.81517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204195.81519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204195.81526: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204195.81566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204195.81569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204195.81573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204195.81609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204196.22784: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": 
"198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41445 1727204196.24863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204196.24868: stdout chunk (state=3): >>><<< 41445 1727204196.24884: stderr chunk (state=3): >>><<< 41445 1727204196.24911: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": 30200, "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204196.24980: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 30200, 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204196.24989: _low_level_execute_command(): starting 41445 1727204196.24994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204195.50631-42634-229122211081350/ > /dev/null 2>&1 && sleep 0' 41445 1727204196.25735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204196.25752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204196.25755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204196.25757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204196.25788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204196.27725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204196.27882: stdout chunk (state=3): >>><<< 41445 1727204196.27885: stderr chunk (state=3): >>><<< 41445 1727204196.27887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204196.27890: handler run complete 41445 1727204196.27892: attempt loop complete, returning result 41445 1727204196.27894: _execute() done 41445 1727204196.27896: dumping result to json 41445 1727204196.27897: done dumping result, returning 41445 1727204196.27899: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-bf02-eee4-000000000026] 41445 1727204196.27902: sending task result for task 028d2410-947f-bf02-eee4-000000000026 41445 1727204196.28331: done sending task result for task 028d2410-947f-bf02-eee4-000000000026 41445 1727204196.28334: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, 
state:up persistent_state:present, 'ethtest0': add connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (not-active) 41445 1727204196.28493: no more pending results, returning what we have 41445 1727204196.28496: results queue empty 41445 1727204196.28497: checking for any_errors_fatal 41445 1727204196.28506: done checking for any_errors_fatal 41445 1727204196.28507: checking for max_fail_percentage 41445 1727204196.28508: done checking for max_fail_percentage 41445 1727204196.28509: checking to see if all hosts have failed and the running result is not ok 41445 1727204196.28510: done checking to see if all hosts have failed 41445 1727204196.28511: getting the remaining hosts for this loop 41445 1727204196.28512: done getting the remaining hosts for this loop 41445 1727204196.28515: getting the next task for host managed-node3 41445 1727204196.28522: done getting next task for host managed-node3 41445 1727204196.28525: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204196.28528: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204196.28538: getting variables 41445 1727204196.28540: in VariableManager get_vars() 41445 1727204196.28704: Calling all_inventory to load vars for managed-node3 41445 1727204196.28707: Calling groups_inventory to load vars for managed-node3 41445 1727204196.28709: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204196.28719: Calling all_plugins_play to load vars for managed-node3 41445 1727204196.28722: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204196.28724: Calling groups_plugins_play to load vars for managed-node3 41445 1727204196.30190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204196.31818: done with get_vars() 41445 1727204196.31852: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.984) 0:00:15.107 ***** 41445 1727204196.31938: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204196.31940: Creating lock for fedora.linux_system_roles.network_state 41445 1727204196.32293: worker is 1 (out of 1 available) 41445 1727204196.32307: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204196.32320: done queuing things up, now waiting for results queue to drain 41445 1727204196.32321: waiting for pending results... 
41445 1727204196.32695: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204196.32789: in run() - task 028d2410-947f-bf02-eee4-000000000027 41445 1727204196.32812: variable 'ansible_search_path' from source: unknown 41445 1727204196.32824: variable 'ansible_search_path' from source: unknown 41445 1727204196.32866: calling self._execute() 41445 1727204196.32969: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.32984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.32999: variable 'omit' from source: magic vars 41445 1727204196.33483: variable 'ansible_distribution_major_version' from source: facts 41445 1727204196.33487: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204196.33545: variable 'network_state' from source: role '' defaults 41445 1727204196.33560: Evaluated conditional (network_state != {}): False 41445 1727204196.33566: when evaluation is False, skipping this task 41445 1727204196.33573: _execute() done 41445 1727204196.33582: dumping result to json 41445 1727204196.33597: done dumping result, returning 41445 1727204196.33608: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-bf02-eee4-000000000027] 41445 1727204196.33622: sending task result for task 028d2410-947f-bf02-eee4-000000000027 41445 1727204196.33838: done sending task result for task 028d2410-947f-bf02-eee4-000000000027 41445 1727204196.33841: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204196.33895: no more pending results, returning what we have 41445 1727204196.33899: results queue empty 41445 1727204196.33900: checking for any_errors_fatal 41445 1727204196.33917: done checking for any_errors_fatal 
41445 1727204196.33918: checking for max_fail_percentage 41445 1727204196.33921: done checking for max_fail_percentage 41445 1727204196.33922: checking to see if all hosts have failed and the running result is not ok 41445 1727204196.33922: done checking to see if all hosts have failed 41445 1727204196.33923: getting the remaining hosts for this loop 41445 1727204196.33924: done getting the remaining hosts for this loop 41445 1727204196.33928: getting the next task for host managed-node3 41445 1727204196.33935: done getting next task for host managed-node3 41445 1727204196.33939: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204196.33947: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204196.33965: getting variables 41445 1727204196.33966: in VariableManager get_vars() 41445 1727204196.34012: Calling all_inventory to load vars for managed-node3 41445 1727204196.34016: Calling groups_inventory to load vars for managed-node3 41445 1727204196.34018: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204196.34029: Calling all_plugins_play to load vars for managed-node3 41445 1727204196.34032: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204196.34035: Calling groups_plugins_play to load vars for managed-node3 41445 1727204196.35671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204196.37311: done with get_vars() 41445 1727204196.37337: done getting variables 41445 1727204196.37405: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.054) 0:00:15.162 ***** 41445 1727204196.37441: entering _queue_task() for managed-node3/debug 41445 1727204196.37785: worker is 1 (out of 1 available) 41445 1727204196.37798: exiting _queue_task() for managed-node3/debug 41445 1727204196.37811: done queuing things up, now waiting for results queue to drain 41445 1727204196.37812: waiting for pending results... 
41445 1727204196.38098: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204196.38280: in run() - task 028d2410-947f-bf02-eee4-000000000028 41445 1727204196.38285: variable 'ansible_search_path' from source: unknown 41445 1727204196.38288: variable 'ansible_search_path' from source: unknown 41445 1727204196.38312: calling self._execute() 41445 1727204196.38421: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.38494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.38498: variable 'omit' from source: magic vars 41445 1727204196.38854: variable 'ansible_distribution_major_version' from source: facts 41445 1727204196.38873: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204196.38888: variable 'omit' from source: magic vars 41445 1727204196.38953: variable 'omit' from source: magic vars 41445 1727204196.38995: variable 'omit' from source: magic vars 41445 1727204196.39049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204196.39090: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204196.39118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204196.39180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204196.39183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204196.39198: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204196.39207: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.39219: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204196.39331: Set connection var ansible_shell_executable to /bin/sh 41445 1727204196.39340: Set connection var ansible_shell_type to sh 41445 1727204196.39364: Set connection var ansible_pipelining to False 41445 1727204196.39370: Set connection var ansible_timeout to 10 41445 1727204196.39474: Set connection var ansible_connection to ssh 41445 1727204196.39479: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204196.39482: variable 'ansible_shell_executable' from source: unknown 41445 1727204196.39484: variable 'ansible_connection' from source: unknown 41445 1727204196.39486: variable 'ansible_module_compression' from source: unknown 41445 1727204196.39488: variable 'ansible_shell_type' from source: unknown 41445 1727204196.39490: variable 'ansible_shell_executable' from source: unknown 41445 1727204196.39492: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.39494: variable 'ansible_pipelining' from source: unknown 41445 1727204196.39496: variable 'ansible_timeout' from source: unknown 41445 1727204196.39498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.39622: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204196.39638: variable 'omit' from source: magic vars 41445 1727204196.39648: starting attempt loop 41445 1727204196.39654: running the handler 41445 1727204196.39787: variable '__network_connections_result' from source: set_fact 41445 1727204196.39854: handler run complete 41445 1727204196.39879: attempt loop complete, returning result 41445 1727204196.39887: _execute() done 41445 1727204196.39894: dumping result to json 41445 1727204196.39912: 
done dumping result, returning 41445 1727204196.39926: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-bf02-eee4-000000000028] 41445 1727204196.39936: sending task result for task 028d2410-947f-bf02-eee4-000000000028 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (not-active)" ] } 41445 1727204196.40188: no more pending results, returning what we have 41445 1727204196.40192: results queue empty 41445 1727204196.40193: checking for any_errors_fatal 41445 1727204196.40201: done checking for any_errors_fatal 41445 1727204196.40202: checking for max_fail_percentage 41445 1727204196.40204: done checking for max_fail_percentage 41445 1727204196.40205: checking to see if all hosts have failed and the running result is not ok 41445 1727204196.40206: done checking to see if all hosts have failed 41445 1727204196.40207: getting the remaining hosts for this loop 41445 1727204196.40208: done getting the remaining hosts for this loop 41445 1727204196.40214: getting the next task for host managed-node3 41445 1727204196.40223: done getting next task for host managed-node3 41445 1727204196.40227: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204196.40232: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204196.40243: getting variables 41445 1727204196.40245: in VariableManager get_vars() 41445 1727204196.40291: Calling all_inventory to load vars for managed-node3 41445 1727204196.40294: Calling groups_inventory to load vars for managed-node3 41445 1727204196.40296: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204196.40307: Calling all_plugins_play to load vars for managed-node3 41445 1727204196.40313: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204196.40316: Calling groups_plugins_play to load vars for managed-node3 41445 1727204196.40888: done sending task result for task 028d2410-947f-bf02-eee4-000000000028 41445 1727204196.40892: WORKER PROCESS EXITING 41445 1727204196.41954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204196.43751: done with get_vars() 41445 1727204196.43773: done getting variables 41445 1727204196.43838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.064) 0:00:15.226 ***** 41445 1727204196.43874: entering _queue_task() for managed-node3/debug 41445 1727204196.44301: worker is 1 (out of 1 available) 41445 1727204196.44314: exiting _queue_task() for managed-node3/debug 41445 1727204196.44325: done queuing things up, now waiting 
for results queue to drain 41445 1727204196.44327: waiting for pending results... 41445 1727204196.44518: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204196.44674: in run() - task 028d2410-947f-bf02-eee4-000000000029 41445 1727204196.44698: variable 'ansible_search_path' from source: unknown 41445 1727204196.44707: variable 'ansible_search_path' from source: unknown 41445 1727204196.44751: calling self._execute() 41445 1727204196.44854: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.44865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.44885: variable 'omit' from source: magic vars 41445 1727204196.45281: variable 'ansible_distribution_major_version' from source: facts 41445 1727204196.45306: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204196.45417: variable 'omit' from source: magic vars 41445 1727204196.45421: variable 'omit' from source: magic vars 41445 1727204196.45423: variable 'omit' from source: magic vars 41445 1727204196.45469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204196.45514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204196.45545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204196.45565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204196.45584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204196.45619: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204196.45632: variable 'ansible_host' from source: host vars 
for 'managed-node3' 41445 1727204196.45644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.45756: Set connection var ansible_shell_executable to /bin/sh 41445 1727204196.45766: Set connection var ansible_shell_type to sh 41445 1727204196.45778: Set connection var ansible_pipelining to False 41445 1727204196.45791: Set connection var ansible_timeout to 10 41445 1727204196.45797: Set connection var ansible_connection to ssh 41445 1727204196.45807: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204196.45851: variable 'ansible_shell_executable' from source: unknown 41445 1727204196.45856: variable 'ansible_connection' from source: unknown 41445 1727204196.45863: variable 'ansible_module_compression' from source: unknown 41445 1727204196.45865: variable 'ansible_shell_type' from source: unknown 41445 1727204196.45867: variable 'ansible_shell_executable' from source: unknown 41445 1727204196.45961: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.45964: variable 'ansible_pipelining' from source: unknown 41445 1727204196.45967: variable 'ansible_timeout' from source: unknown 41445 1727204196.45969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.46040: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204196.46055: variable 'omit' from source: magic vars 41445 1727204196.46070: starting attempt loop 41445 1727204196.46080: running the handler 41445 1727204196.46133: variable '__network_connections_result' from source: set_fact 41445 1727204196.46222: variable '__network_connections_result' from source: set_fact 41445 1727204196.46417: handler run complete 41445 
1727204196.46456: attempt loop complete, returning result 41445 1727204196.46464: _execute() done 41445 1727204196.46481: dumping result to json 41445 1727204196.46484: done dumping result, returning 41445 1727204196.46516: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-bf02-eee4-000000000029] 41445 1727204196.46519: sending task result for task 028d2410-947f-bf02-eee4-000000000029 ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": 30200 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (not-active)" ] } } 41445 1727204196.46799: no more pending results, returning what we have 41445 1727204196.46803: results queue empty 41445 1727204196.46804: checking 
for any_errors_fatal 41445 1727204196.46815: done checking for any_errors_fatal 41445 1727204196.46817: checking for max_fail_percentage 41445 1727204196.46818: done checking for max_fail_percentage 41445 1727204196.46820: checking to see if all hosts have failed and the running result is not ok 41445 1727204196.46820: done checking to see if all hosts have failed 41445 1727204196.46821: getting the remaining hosts for this loop 41445 1727204196.46822: done getting the remaining hosts for this loop 41445 1727204196.46826: getting the next task for host managed-node3 41445 1727204196.46840: done getting next task for host managed-node3 41445 1727204196.46845: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204196.46848: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204196.46860: getting variables 41445 1727204196.46862: in VariableManager get_vars() 41445 1727204196.47104: Calling all_inventory to load vars for managed-node3 41445 1727204196.47107: Calling groups_inventory to load vars for managed-node3 41445 1727204196.47112: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204196.47121: Calling all_plugins_play to load vars for managed-node3 41445 1727204196.47124: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204196.47128: Calling groups_plugins_play to load vars for managed-node3 41445 1727204196.47692: done sending task result for task 028d2410-947f-bf02-eee4-000000000029 41445 1727204196.47695: WORKER PROCESS EXITING 41445 1727204196.48444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204196.49331: done with get_vars() 41445 1727204196.49350: done getting variables 41445 1727204196.49394: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.055) 0:00:15.281 ***** 41445 1727204196.49420: entering _queue_task() for managed-node3/debug 41445 1727204196.49654: worker is 1 (out of 1 available) 41445 1727204196.49667: exiting _queue_task() for managed-node3/debug 41445 1727204196.49679: done queuing things up, now waiting for results queue to drain 41445 1727204196.49681: waiting for pending results... 
41445 1727204196.49862: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204196.49959: in run() - task 028d2410-947f-bf02-eee4-00000000002a 41445 1727204196.49970: variable 'ansible_search_path' from source: unknown 41445 1727204196.49974: variable 'ansible_search_path' from source: unknown 41445 1727204196.50004: calling self._execute() 41445 1727204196.50096: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.50100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.50108: variable 'omit' from source: magic vars 41445 1727204196.50681: variable 'ansible_distribution_major_version' from source: facts 41445 1727204196.50686: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204196.50689: variable 'network_state' from source: role '' defaults 41445 1727204196.50691: Evaluated conditional (network_state != {}): False 41445 1727204196.50694: when evaluation is False, skipping this task 41445 1727204196.50696: _execute() done 41445 1727204196.50698: dumping result to json 41445 1727204196.50701: done dumping result, returning 41445 1727204196.50704: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-bf02-eee4-00000000002a] 41445 1727204196.50706: sending task result for task 028d2410-947f-bf02-eee4-00000000002a 41445 1727204196.50772: done sending task result for task 028d2410-947f-bf02-eee4-00000000002a skipping: [managed-node3] => { "false_condition": "network_state != {}" } 41445 1727204196.50840: no more pending results, returning what we have 41445 1727204196.50845: results queue empty 41445 1727204196.50846: checking for any_errors_fatal 41445 1727204196.50855: done checking for any_errors_fatal 41445 1727204196.50856: checking for max_fail_percentage 41445 1727204196.50858: done 
checking for max_fail_percentage 41445 1727204196.50859: checking to see if all hosts have failed and the running result is not ok 41445 1727204196.50860: done checking to see if all hosts have failed 41445 1727204196.50861: getting the remaining hosts for this loop 41445 1727204196.50862: done getting the remaining hosts for this loop 41445 1727204196.50866: getting the next task for host managed-node3 41445 1727204196.50873: done getting next task for host managed-node3 41445 1727204196.50880: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204196.50883: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204196.50899: getting variables 41445 1727204196.50901: in VariableManager get_vars() 41445 1727204196.50947: Calling all_inventory to load vars for managed-node3 41445 1727204196.50950: Calling groups_inventory to load vars for managed-node3 41445 1727204196.50953: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204196.50964: Calling all_plugins_play to load vars for managed-node3 41445 1727204196.50968: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204196.50972: Calling groups_plugins_play to load vars for managed-node3 41445 1727204196.51757: WORKER PROCESS EXITING 41445 1727204196.52742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204196.55567: done with get_vars() 41445 1727204196.55713: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.064) 0:00:15.346 ***** 41445 1727204196.55825: entering _queue_task() for managed-node3/ping 41445 1727204196.55827: Creating lock for ping 41445 1727204196.56253: worker is 1 (out of 1 available) 41445 1727204196.56267: exiting _queue_task() for managed-node3/ping 41445 1727204196.56346: done queuing things up, now waiting for results queue to drain 41445 1727204196.56348: waiting for pending results... 
41445 1727204196.56545: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204196.56690: in run() - task 028d2410-947f-bf02-eee4-00000000002b 41445 1727204196.56705: variable 'ansible_search_path' from source: unknown 41445 1727204196.56708: variable 'ansible_search_path' from source: unknown 41445 1727204196.56748: calling self._execute() 41445 1727204196.56889: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.56892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.56896: variable 'omit' from source: magic vars 41445 1727204196.57274: variable 'ansible_distribution_major_version' from source: facts 41445 1727204196.57288: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204196.57294: variable 'omit' from source: magic vars 41445 1727204196.57351: variable 'omit' from source: magic vars 41445 1727204196.57385: variable 'omit' from source: magic vars 41445 1727204196.57427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204196.57470: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204196.57489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204196.57507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204196.57520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204196.57597: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204196.57600: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.57603: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41445 1727204196.57673: Set connection var ansible_shell_executable to /bin/sh 41445 1727204196.57678: Set connection var ansible_shell_type to sh 41445 1727204196.57682: Set connection var ansible_pipelining to False 41445 1727204196.57691: Set connection var ansible_timeout to 10 41445 1727204196.57693: Set connection var ansible_connection to ssh 41445 1727204196.57703: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204196.57816: variable 'ansible_shell_executable' from source: unknown 41445 1727204196.57820: variable 'ansible_connection' from source: unknown 41445 1727204196.57822: variable 'ansible_module_compression' from source: unknown 41445 1727204196.57824: variable 'ansible_shell_type' from source: unknown 41445 1727204196.57826: variable 'ansible_shell_executable' from source: unknown 41445 1727204196.57828: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204196.57830: variable 'ansible_pipelining' from source: unknown 41445 1727204196.57832: variable 'ansible_timeout' from source: unknown 41445 1727204196.57834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204196.57957: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204196.57966: variable 'omit' from source: magic vars 41445 1727204196.57970: starting attempt loop 41445 1727204196.57979: running the handler 41445 1727204196.58022: _low_level_execute_command(): starting 41445 1727204196.58032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204196.58934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204196.58937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204196.58965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204196.59030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204196.60695: stdout chunk (state=3): >>>/root <<< 41445 1727204196.60987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204196.60990: stdout chunk (state=3): >>><<< 41445 1727204196.60993: stderr chunk (state=3): >>><<< 41445 1727204196.60996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204196.60999: _low_level_execute_command(): starting 41445 1727204196.61002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629 `" && echo ansible-tmp-1727204196.608564-42684-193516468913629="` echo /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629 `" ) && sleep 0' 41445 1727204196.61452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204196.61460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204196.61471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204196.61488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204196.61502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204196.61513: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204196.61525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204196.61536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 
1727204196.61632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204196.61637: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204196.61640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204196.61642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204196.61644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204196.61647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204196.61649: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204196.61651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204196.61666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204196.61686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204196.61713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204196.61768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204196.63758: stdout chunk (state=3): >>>ansible-tmp-1727204196.608564-42684-193516468913629=/root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629 <<< 41445 1727204196.63762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204196.63765: stderr chunk (state=3): >>><<< 41445 1727204196.63767: stdout chunk (state=3): >>><<< 41445 1727204196.63811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204196.608564-42684-193516468913629=/root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204196.63920: variable 'ansible_module_compression' from source: unknown 41445 1727204196.63923: ANSIBALLZ: Using lock for ping 41445 1727204196.63925: ANSIBALLZ: Acquiring lock 41445 1727204196.63927: ANSIBALLZ: Lock acquired: 140182278924064 41445 1727204196.63984: ANSIBALLZ: Creating module 41445 1727204196.87479: ANSIBALLZ: Writing module into payload 41445 1727204196.87582: ANSIBALLZ: Writing module 41445 1727204196.87585: ANSIBALLZ: Renaming module 41445 1727204196.87587: ANSIBALLZ: Done creating module 41445 1727204196.87589: variable 'ansible_facts' from source: unknown 41445 1727204196.87652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py 41445 1727204196.87985: Sending initial data 41445 1727204196.87988: Sent initial data (152 bytes) 41445 1727204196.88384: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204196.88395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204196.88406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204196.88419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204196.88497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204196.88526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204196.88540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204196.88543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204196.88842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204196.90453: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204196.90486: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204196.90523: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpe6ec4c5a /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py <<< 41445 1727204196.90527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py" <<< 41445 1727204196.90562: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpe6ec4c5a" to remote "/root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py" <<< 41445 1727204196.91824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204196.91890: stderr chunk (state=3): >>><<< 41445 1727204196.91897: stdout chunk (state=3): >>><<< 41445 1727204196.91956: done transferring module to remote 41445 1727204196.91959: _low_level_execute_command(): starting 41445 1727204196.91961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/ /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py && sleep 0' 41445 1727204196.92537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204196.92583: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204196.92601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204196.92604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204196.92606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204196.92608: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204196.92662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204196.92693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204196.92737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204196.94685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204196.94689: stdout chunk (state=3): >>><<< 41445 1727204196.94692: stderr chunk (state=3): >>><<< 41445 1727204196.94784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204196.94788: _low_level_execute_command(): starting 41445 1727204196.94790: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/AnsiballZ_ping.py && sleep 0' 41445 1727204196.95688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204196.95705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204196.95782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204196.95797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204196.95844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204196.95871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204196.95946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.10919: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41445 1727204197.12294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204197.12326: stderr chunk (state=3): >>><<< 41445 1727204197.12334: stdout chunk (state=3): >>><<< 41445 1727204197.12346: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204197.12366: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204197.12374: _low_level_execute_command(): starting 41445 1727204197.12388: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204196.608564-42684-193516468913629/ > /dev/null 2>&1 && sleep 0' 41445 1727204197.12816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.12819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.12847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 
1727204197.12850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.12856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204197.12858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.12907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.12915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.12950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.14776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.14783: stderr chunk (state=3): >>><<< 41445 1727204197.14786: stdout chunk (state=3): >>><<< 41445 1727204197.14801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.14807: handler run complete 41445 1727204197.14821: attempt loop complete, returning result 41445 1727204197.14824: _execute() done 41445 1727204197.14827: dumping result to json 41445 1727204197.14829: done dumping result, returning 41445 1727204197.14841: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-bf02-eee4-00000000002b] 41445 1727204197.14843: sending task result for task 028d2410-947f-bf02-eee4-00000000002b 41445 1727204197.14932: done sending task result for task 028d2410-947f-bf02-eee4-00000000002b 41445 1727204197.14934: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 41445 1727204197.14992: no more pending results, returning what we have 41445 1727204197.14995: results queue empty 41445 1727204197.14996: checking for any_errors_fatal 41445 1727204197.15003: done checking for any_errors_fatal 41445 1727204197.15003: checking for max_fail_percentage 41445 1727204197.15005: done checking for max_fail_percentage 41445 1727204197.15005: checking to see if all hosts have failed and the running result is not ok 41445 1727204197.15006: done checking to see if all hosts have failed 41445 1727204197.15007: getting the remaining hosts for this loop 41445 1727204197.15008: done getting the remaining hosts for this loop 41445 1727204197.15014: getting the next task for host managed-node3 41445 1727204197.15023: done getting next task for host 
managed-node3 41445 1727204197.15026: ^ task is: TASK: meta (role_complete) 41445 1727204197.15029: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204197.15038: getting variables 41445 1727204197.15040: in VariableManager get_vars() 41445 1727204197.15084: Calling all_inventory to load vars for managed-node3 41445 1727204197.15087: Calling groups_inventory to load vars for managed-node3 41445 1727204197.15089: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204197.15098: Calling all_plugins_play to load vars for managed-node3 41445 1727204197.15101: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204197.15105: Calling groups_plugins_play to load vars for managed-node3 41445 1727204197.16440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204197.17322: done with get_vars() 41445 1727204197.17339: done getting variables 41445 1727204197.17400: done queuing things up, now waiting for results queue to drain 41445 1727204197.17401: results queue empty 41445 1727204197.17402: checking for any_errors_fatal 41445 1727204197.17404: done checking for any_errors_fatal 41445 1727204197.17404: checking for max_fail_percentage 41445 1727204197.17405: done checking for max_fail_percentage 41445 1727204197.17405: checking to see if all hosts have failed and the running result is not ok 41445 1727204197.17406: done checking to see if 
all hosts have failed 41445 1727204197.17406: getting the remaining hosts for this loop 41445 1727204197.17407: done getting the remaining hosts for this loop 41445 1727204197.17410: getting the next task for host managed-node3 41445 1727204197.17413: done getting next task for host managed-node3 41445 1727204197.17415: ^ task is: TASK: Get the routes from the route table 30200 41445 1727204197.17416: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204197.17417: getting variables 41445 1727204197.17418: in VariableManager get_vars() 41445 1727204197.17427: Calling all_inventory to load vars for managed-node3 41445 1727204197.17428: Calling groups_inventory to load vars for managed-node3 41445 1727204197.17430: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204197.17433: Calling all_plugins_play to load vars for managed-node3 41445 1727204197.17434: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204197.17436: Calling groups_plugins_play to load vars for managed-node3 41445 1727204197.18217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204197.19482: done with get_vars() 41445 1727204197.19501: done getting variables 41445 1727204197.19533: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the route table 30200] ******************************* task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:56 Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.637) 0:00:15.983 ***** 41445 1727204197.19553: entering _queue_task() for managed-node3/command 41445 1727204197.19839: worker is 1 (out of 1 available) 41445 1727204197.19854: exiting _queue_task() for managed-node3/command 41445 1727204197.19866: done queuing things up, now waiting for results queue to drain 41445 1727204197.19867: waiting for pending results... 41445 1727204197.20055: running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30200 41445 1727204197.20120: in run() - task 028d2410-947f-bf02-eee4-00000000005b 41445 1727204197.20132: variable 'ansible_search_path' from source: unknown 41445 1727204197.20161: calling self._execute() 41445 1727204197.20240: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.20244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.20255: variable 'omit' from source: magic vars 41445 1727204197.20546: variable 'ansible_distribution_major_version' from source: facts 41445 1727204197.20557: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204197.20562: variable 'omit' from source: magic vars 41445 1727204197.20579: variable 'omit' from source: magic vars 41445 1727204197.20604: variable 'omit' from source: magic vars 41445 1727204197.20639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204197.20667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204197.20684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204197.20697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 41445 1727204197.20707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204197.20733: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204197.20736: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.20740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.20808: Set connection var ansible_shell_executable to /bin/sh 41445 1727204197.20811: Set connection var ansible_shell_type to sh 41445 1727204197.20818: Set connection var ansible_pipelining to False 41445 1727204197.20825: Set connection var ansible_timeout to 10 41445 1727204197.20827: Set connection var ansible_connection to ssh 41445 1727204197.20833: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204197.20858: variable 'ansible_shell_executable' from source: unknown 41445 1727204197.20861: variable 'ansible_connection' from source: unknown 41445 1727204197.20863: variable 'ansible_module_compression' from source: unknown 41445 1727204197.20866: variable 'ansible_shell_type' from source: unknown 41445 1727204197.20868: variable 'ansible_shell_executable' from source: unknown 41445 1727204197.20870: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.20872: variable 'ansible_pipelining' from source: unknown 41445 1727204197.20874: variable 'ansible_timeout' from source: unknown 41445 1727204197.20878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.20979: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204197.20989: 
variable 'omit' from source: magic vars 41445 1727204197.20992: starting attempt loop 41445 1727204197.20994: running the handler 41445 1727204197.21009: _low_level_execute_command(): starting 41445 1727204197.21018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204197.21528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.21532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.21536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.21587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.21590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.21593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.21635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.23198: stdout chunk (state=3): >>>/root <<< 41445 1727204197.23302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.23332: stderr chunk 
(state=3): >>><<< 41445 1727204197.23335: stdout chunk (state=3): >>><<< 41445 1727204197.23360: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.23371: _low_level_execute_command(): starting 41445 1727204197.23379: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146 `" && echo ansible-tmp-1727204197.2335737-42707-147268599024146="` echo /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146 `" ) && sleep 0' 41445 1727204197.23822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.23825: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204197.23828: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204197.23838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204197.23840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.23883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.23887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.23905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.23928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.25755: stdout chunk (state=3): >>>ansible-tmp-1727204197.2335737-42707-147268599024146=/root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146 <<< 41445 1727204197.25864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.25891: stderr chunk (state=3): >>><<< 41445 1727204197.25894: stdout chunk (state=3): >>><<< 41445 1727204197.25912: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204197.2335737-42707-147268599024146=/root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.25938: variable 'ansible_module_compression' from source: unknown 41445 1727204197.25979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204197.26008: variable 'ansible_facts' from source: unknown 41445 1727204197.26065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py 41445 1727204197.26167: Sending initial data 41445 1727204197.26171: Sent initial data (156 bytes) 41445 1727204197.26617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.26620: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204197.26622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.26625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204197.26627: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.26671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.26679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.26682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.26708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.28201: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41445 1727204197.28204: stderr chunk 
(state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204197.28234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204197.28267: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpx7_d2912 /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py <<< 41445 1727204197.28274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py" <<< 41445 1727204197.28299: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpx7_d2912" to remote "/root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py" <<< 41445 1727204197.28308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py" <<< 41445 1727204197.28802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.28839: stderr chunk (state=3): >>><<< 41445 1727204197.28842: stdout chunk (state=3): >>><<< 41445 1727204197.28883: done transferring module to remote 41445 1727204197.28893: _low_level_execute_command(): starting 41445 1727204197.28896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/ /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py && sleep 0' 41445 1727204197.29330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.29333: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204197.29335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.29337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204197.29344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.29380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.29383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.29424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.31112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.31137: stderr chunk (state=3): >>><<< 41445 1727204197.31140: stdout chunk (state=3): >>><<< 41445 1727204197.31156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.31159: _low_level_execute_command(): starting 41445 1727204197.31164: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/AnsiballZ_command.py && sleep 0' 41445 1727204197.31559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.31597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.31600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204197.31602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.31604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41445 1727204197.31606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.31650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.31654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.31700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.46860: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-24 14:56:37.463838", "end": "2024-09-24 14:56:37.467525", "delta": "0:00:00.003687", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204197.48286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204197.48291: stdout chunk (state=3): >>><<< 41445 1727204197.48295: stderr chunk (state=3): >>><<< 41445 1727204197.48316: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30200"], "start": "2024-09-24 14:56:37.463838", "end": "2024-09-24 14:56:37.467525", "delta": "0:00:00.003687", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204197.48346: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204197.48353: _low_level_execute_command(): starting 41445 1727204197.48358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204197.2335737-42707-147268599024146/ > /dev/null 2>&1 && sleep 0' 41445 1727204197.49182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.49201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.50767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.50812: stderr chunk (state=3): >>><<< 41445 1727204197.50818: stdout chunk (state=3): >>><<< 41445 1727204197.50837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.50848: handler run complete 41445 1727204197.50873: Evaluated conditional (False): False 41445 1727204197.50890: attempt loop complete, returning result 41445 
1727204197.50897: _execute() done
41445 1727204197.50903: dumping result to json
41445 1727204197.50913: done dumping result, returning
41445 1727204197.50924: done running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30200 [028d2410-947f-bf02-eee4-00000000005b]
41445 1727204197.50940: sending task result for task 028d2410-947f-bf02-eee4-00000000005b
ok: [managed-node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "route",
        "show",
        "table",
        "30200"
    ],
    "delta": "0:00:00.003687",
    "end": "2024-09-24 14:56:37.467525",
    "rc": 0,
    "start": "2024-09-24 14:56:37.463838"
}

STDOUT:

192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 
198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 

41445 1727204197.51133: no more pending results, returning what we have
41445 1727204197.51137: results queue empty
41445 1727204197.51138: checking for any_errors_fatal
41445 1727204197.51139: done checking for any_errors_fatal
41445 1727204197.51140: checking for max_fail_percentage
41445 1727204197.51142: done checking for max_fail_percentage
41445 1727204197.51143: checking to see if all hosts have failed and the running result is not ok
41445 1727204197.51144: done checking to see if all hosts have failed
41445 1727204197.51144: getting the remaining hosts for this loop
41445 1727204197.51146: done getting the remaining hosts for this loop
41445 1727204197.51149: getting the next task for host managed-node3
41445 1727204197.51155: done getting next task for host managed-node3
41445 1727204197.51160: ^ task is: TASK: Get the routes from the route table 30400
41445 1727204197.51162: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204197.51167: getting variables
41445 1727204197.51168: in VariableManager get_vars()
41445 1727204197.51511: Calling all_inventory to load vars for managed-node3
41445 1727204197.51514: Calling groups_inventory to load vars for managed-node3
41445 1727204197.51516: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204197.51529: Calling all_plugins_play to load vars for managed-node3
41445 1727204197.51532: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204197.51534: Calling groups_plugins_play to load vars for managed-node3
41445 1727204197.52089: done sending task result for task 028d2410-947f-bf02-eee4-00000000005b
41445 1727204197.52092: WORKER PROCESS EXITING
41445 1727204197.52989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204197.54684: done with get_vars()
41445 1727204197.54706: done getting variables
41445 1727204197.54765: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get the routes from the route table 30400] *******************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:62
Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.352)       0:00:16.335 ***** 
41445 1727204197.54796: entering _queue_task() for managed-node3/command
41445 1727204197.55144: worker is 1 (out of 1 available)
41445 1727204197.55158: exiting _queue_task() for managed-node3/command
41445 1727204197.55170: done queuing things up, now waiting for results queue to drain
41445 1727204197.55172: waiting for pending results...
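[Editor's note] The task above runs `ip route show table 30200` and the test suite asserts on its stdout. As an illustration only (not part of the test suite), a minimal parser for route lines in the format shown in the STDOUT above might look like this; the helper name `parse_route` is hypothetical, and it assumes the output is the destination followed by key/value pairs, which holds for the lines in this log:

```python
def parse_route(line):
    """Parse one line of `ip route show` output into a dict.

    Assumes the line is a destination prefix followed by key/value
    pairs (via X dev Y proto Z ...), as in the routes logged above.
    """
    fields = line.split()
    route = {"dst": fields[0]}
    it = iter(fields[1:])
    route.update(zip(it, it))  # pair up: via <gw>, dev <if>, metric <n>, ...
    if "metric" in route:
        route["metric"] = int(route["metric"])
    return route
```

With the first route from the log, `parse_route(...)` yields `dst` `192.0.2.64/26`, `via` `198.51.100.8`, `dev` `ethtest0` and `metric` `50`, which is the shape a test assertion would compare against.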
41445 1727204197.55448: running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30400 41445 1727204197.55561: in run() - task 028d2410-947f-bf02-eee4-00000000005c 41445 1727204197.55589: variable 'ansible_search_path' from source: unknown 41445 1727204197.55634: calling self._execute() 41445 1727204197.55742: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.55753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.55767: variable 'omit' from source: magic vars 41445 1727204197.56174: variable 'ansible_distribution_major_version' from source: facts 41445 1727204197.56197: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204197.56227: variable 'omit' from source: magic vars 41445 1727204197.56328: variable 'omit' from source: magic vars 41445 1727204197.56482: variable 'omit' from source: magic vars 41445 1727204197.56598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204197.56602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204197.56604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204197.56607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204197.56609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204197.56650: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204197.56659: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.56668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.56791: Set connection var ansible_shell_executable to /bin/sh 
41445 1727204197.56801: Set connection var ansible_shell_type to sh 41445 1727204197.56818: Set connection var ansible_pipelining to False 41445 1727204197.56838: Set connection var ansible_timeout to 10 41445 1727204197.56846: Set connection var ansible_connection to ssh 41445 1727204197.56858: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204197.56892: variable 'ansible_shell_executable' from source: unknown 41445 1727204197.56902: variable 'ansible_connection' from source: unknown 41445 1727204197.56910: variable 'ansible_module_compression' from source: unknown 41445 1727204197.56921: variable 'ansible_shell_type' from source: unknown 41445 1727204197.56933: variable 'ansible_shell_executable' from source: unknown 41445 1727204197.56945: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.57036: variable 'ansible_pipelining' from source: unknown 41445 1727204197.57039: variable 'ansible_timeout' from source: unknown 41445 1727204197.57043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.57120: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204197.57138: variable 'omit' from source: magic vars 41445 1727204197.57159: starting attempt loop 41445 1727204197.57168: running the handler 41445 1727204197.57256: _low_level_execute_command(): starting 41445 1727204197.57259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204197.57954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204197.57992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.58023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204197.58060: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.58081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.58157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.58200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.58348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.59853: stdout chunk (state=3): >>>/root <<< 41445 1727204197.59989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.59995: stdout chunk (state=3): >>><<< 41445 1727204197.60006: stderr chunk (state=3): >>><<< 41445 1727204197.60031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.60084: _low_level_execute_command(): starting 41445 1727204197.60093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521 `" && echo ansible-tmp-1727204197.6003017-42717-237444439602521="` echo /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521 `" ) && sleep 0' 41445 1727204197.60753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.60756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204197.60759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.60822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.60825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.60882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.60921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.62747: stdout chunk (state=3): >>>ansible-tmp-1727204197.6003017-42717-237444439602521=/root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521 <<< 41445 1727204197.62860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.62879: stderr chunk (state=3): >>><<< 41445 1727204197.62883: stdout chunk (state=3): >>><<< 41445 1727204197.62899: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204197.6003017-42717-237444439602521=/root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.62928: variable 'ansible_module_compression' from source: unknown 41445 1727204197.62969: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204197.62997: variable 'ansible_facts' from source: unknown 41445 1727204197.63055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py 41445 1727204197.63156: Sending initial data 41445 1727204197.63160: Sent initial data (156 bytes) 41445 1727204197.63553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.63585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.63588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204197.63590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.63592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41445 1727204197.63594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.63643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.63646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.63691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.65261: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204197.65300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
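[Editor's note] The log entries above show Ansible staging `AnsiballZ_command.py` over SFTP into a per-task temp directory such as `ansible-tmp-1727204197.6003017-42717-237444439602521` (timestamp, controller PID, random suffix). As a rough sketch of that naming scheme only (the real logic lives in Ansible's shell plugin, and `ansible_style_tmpdir` is a hypothetical helper):

```python
import os
import random
import time

def ansible_style_tmpdir(remote_tmp="~/.ansible/tmp"):
    """Build a per-task temp dir name resembling the ones in this log:
    ansible-tmp-<epoch>-<pid>-<random>. Illustrative sketch only."""
    return "%s/ansible-tmp-%s-%s-%s" % (
        remote_tmp, time.time(), os.getpid(), random.randint(0, 2**48)
    )
```

Because the name embeds a timestamp and a random suffix, concurrent tasks on the same host get distinct staging directories, which is why each `_low_level_execute_command()` pair above creates and then removes its own `ansible-tmp-*` path.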
<<< 41445 1727204197.65334: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpmbnnpl40 /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py <<< 41445 1727204197.65337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py" <<< 41445 1727204197.65368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpmbnnpl40" to remote "/root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py" <<< 41445 1727204197.65370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py" <<< 41445 1727204197.65862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.65903: stderr chunk (state=3): >>><<< 41445 1727204197.65906: stdout chunk (state=3): >>><<< 41445 1727204197.65944: done transferring module to remote 41445 1727204197.65953: _low_level_execute_command(): starting 41445 1727204197.65957: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/ /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py && sleep 0' 41445 1727204197.66367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.66374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.66399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.66402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.66405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.66455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.66476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.66508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.68515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.68519: stderr chunk (state=3): >>><<< 41445 1727204197.68521: stdout chunk (state=3): >>><<< 41445 1727204197.68535: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.68538: _low_level_execute_command(): starting 41445 1727204197.68540: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/AnsiballZ_command.py && sleep 0' 41445 1727204197.69078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204197.69119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.69122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.69124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.69126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.69128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204197.69130: stderr chunk (state=3): >>>debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.69178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.69192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.69228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.84696: stdout chunk (state=3): >>> {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-24 14:56:37.842507", "end": "2024-09-24 14:56:37.845827", "delta": "0:00:00.003320", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204197.86183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204197.86187: stdout chunk (state=3): >>><<< 41445 1727204197.86189: stderr chunk (state=3): >>><<< 41445 1727204197.86192: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "30400"], "start": "2024-09-24 14:56:37.842507", "end": "2024-09-24 14:56:37.845827", "delta": "0:00:00.003320", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
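[Editor's note] The stdout chunk above is the JSON document that `AnsiballZ_command.py` prints on the remote host; the controller parses it to build the task result it reports. A minimal standard-library sketch of reading such a payload (not Ansible's actual parsing code; `read_command_result` is a hypothetical helper):

```python
import json

def read_command_result(payload: str) -> dict:
    """Decode a command-module result payload and surface the fields
    the controller reports back (rc, stdout lines, timing delta)."""
    result = json.loads(payload)
    return {
        "rc": result["rc"],
        "routes": result["stdout"].splitlines(),
        "delta": result["delta"],
    }

# Payload taken verbatim from the table-30400 result in the log above
# (trimmed to the fields used here).
payload = (
    '{"changed": true, "stdout": "198.51.100.128/26 via 198.51.100.1 '
    'dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "route", "show", "table", "30400"], '
    '"delta": "0:00:00.003320", "msg": ""}'
)
summary = read_command_result(payload)
```

For this payload, `summary["rc"]` is 0 and `summary["routes"]` holds the single route line for table 30400.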
41445 1727204197.86194: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204197.86196: _low_level_execute_command(): starting 41445 1727204197.86198: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204197.6003017-42717-237444439602521/ > /dev/null 2>&1 && sleep 0' 41445 1727204197.86794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204197.86802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.86814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.86834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.86846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204197.86853: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204197.86862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.86878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204197.86893: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204197.86900: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204197.86908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204197.87071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204197.87083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204197.87086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204197.87088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204197.87090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204197.87092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204197.87122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204197.88953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204197.88957: stdout chunk (state=3): >>><<< 41445 1727204197.89181: stderr chunk (state=3): >>><<< 41445 1727204197.89184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204197.89186: handler run complete 41445 1727204197.89188: Evaluated conditional (False): False 41445 1727204197.89190: attempt loop complete, returning result 41445 1727204197.89191: _execute() done 41445 1727204197.89193: dumping result to json 41445 1727204197.89194: done dumping result, returning 41445 1727204197.89196: done running TaskExecutor() for managed-node3/TASK: Get the routes from the route table 30400 [028d2410-947f-bf02-eee4-00000000005c] 41445 1727204197.89198: sending task result for task 028d2410-947f-bf02-eee4-00000000005c 41445 1727204197.89261: done sending task result for task 028d2410-947f-bf02-eee4-00000000005c 41445 1727204197.89263: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "30400" ], "delta": "0:00:00.003320", "end": "2024-09-24 14:56:37.845827", "rc": 0, "start": "2024-09-24 14:56:37.842507" } STDOUT: 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 41445 1727204197.89366: no more pending results, returning what we have 41445 1727204197.89370: results queue empty 41445 1727204197.89371: checking for any_errors_fatal 41445 1727204197.89384: done checking for any_errors_fatal 41445 1727204197.89385: checking for max_fail_percentage 41445 1727204197.89388: done checking for max_fail_percentage 41445 1727204197.89389: 
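[Editor's note, not part of the captured log] The task result above returns one route line from table 30400. As a minimal sketch of how such a line breaks into fields, the destination prefix comes first and the remaining tokens alternate as keyword/value pairs (`via`, `dev`, `proto`, `metric`). This parser is illustrative only; real `ip route` output can also contain bare flag tokens that it does not handle.

```python
# Route line captured in the task result above
# (stdout of `ip route show table 30400`).
route_line = "198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2"

def parse_route(line):
    """Split one `ip route` line into its key/value fields.

    The destination prefix is the first token; the rest alternate as
    keyword/value pairs (via, dev, proto, metric, src, ...). A sketch --
    flag-only tokens in real output are not handled here.
    """
    tokens = line.split()
    route = {"dst": tokens[0]}
    for key, value in zip(tokens[1::2], tokens[2::2]):
        route[key] = value
    return route

route = parse_route(route_line)
# e.g. route["via"] -> "198.51.100.1", route["dev"] -> "ethtest0"
```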
checking to see if all hosts have failed and the running result is not ok 41445 1727204197.89390: done checking to see if all hosts have failed 41445 1727204197.89391: getting the remaining hosts for this loop 41445 1727204197.89392: done getting the remaining hosts for this loop 41445 1727204197.89396: getting the next task for host managed-node3 41445 1727204197.89403: done getting next task for host managed-node3 41445 1727204197.89406: ^ task is: TASK: Assert that the route table 30200 contains the specified route 41445 1727204197.89408: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204197.89416: getting variables 41445 1727204197.89418: in VariableManager get_vars() 41445 1727204197.89464: Calling all_inventory to load vars for managed-node3 41445 1727204197.89467: Calling groups_inventory to load vars for managed-node3 41445 1727204197.89470: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204197.89671: Calling all_plugins_play to load vars for managed-node3 41445 1727204197.89677: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204197.89681: Calling groups_plugins_play to load vars for managed-node3 41445 1727204197.90700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204197.91567: done with get_vars() 41445 1727204197.91585: done getting variables 41445 1727204197.91628: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert 
that the route table 30200 contains the specified route] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:68 Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.368) 0:00:16.704 ***** 41445 1727204197.91648: entering _queue_task() for managed-node3/assert 41445 1727204197.91889: worker is 1 (out of 1 available) 41445 1727204197.91906: exiting _queue_task() for managed-node3/assert 41445 1727204197.91919: done queuing things up, now waiting for results queue to drain 41445 1727204197.91920: waiting for pending results... 41445 1727204197.92131: running TaskExecutor() for managed-node3/TASK: Assert that the route table 30200 contains the specified route 41445 1727204197.92243: in run() - task 028d2410-947f-bf02-eee4-00000000005d 41445 1727204197.92264: variable 'ansible_search_path' from source: unknown 41445 1727204197.92307: calling self._execute() 41445 1727204197.92407: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.92581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.92585: variable 'omit' from source: magic vars 41445 1727204197.92816: variable 'ansible_distribution_major_version' from source: facts 41445 1727204197.92835: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204197.92845: variable 'omit' from source: magic vars 41445 1727204197.92868: variable 'omit' from source: magic vars 41445 1727204197.92914: variable 'omit' from source: magic vars 41445 1727204197.92963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204197.93003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204197.93026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204197.93048: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204197.93067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204197.93105: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204197.93115: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.93123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.93212: Set connection var ansible_shell_executable to /bin/sh 41445 1727204197.93216: Set connection var ansible_shell_type to sh 41445 1727204197.93252: Set connection var ansible_pipelining to False 41445 1727204197.93255: Set connection var ansible_timeout to 10 41445 1727204197.93259: Set connection var ansible_connection to ssh 41445 1727204197.93262: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204197.93265: variable 'ansible_shell_executable' from source: unknown 41445 1727204197.93267: variable 'ansible_connection' from source: unknown 41445 1727204197.93269: variable 'ansible_module_compression' from source: unknown 41445 1727204197.93272: variable 'ansible_shell_type' from source: unknown 41445 1727204197.93274: variable 'ansible_shell_executable' from source: unknown 41445 1727204197.93278: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204197.93280: variable 'ansible_pipelining' from source: unknown 41445 1727204197.93282: variable 'ansible_timeout' from source: unknown 41445 1727204197.93284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204197.93390: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204197.93398: variable 'omit' from source: magic vars 41445 1727204197.93411: starting attempt loop 41445 1727204197.93414: running the handler 41445 1727204197.93525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204197.93691: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204197.93723: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204197.93778: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204197.93804: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204197.93865: variable 'route_table_30200' from source: set_fact 41445 1727204197.93890: Evaluated conditional (route_table_30200.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 41445 1727204197.93983: variable 'route_table_30200' from source: set_fact 41445 1727204197.94002: Evaluated conditional (route_table_30200.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 41445 1727204197.94012: handler run complete 41445 1727204197.94020: attempt loop complete, returning result 41445 1727204197.94023: _execute() done 41445 1727204197.94026: dumping result to json 41445 1727204197.94028: done dumping result, returning 41445 1727204197.94034: done running TaskExecutor() for managed-node3/TASK: Assert that the route table 30200 contains the specified route [028d2410-947f-bf02-eee4-00000000005d] 41445 1727204197.94039: sending task result for task 028d2410-947f-bf02-eee4-00000000005d 41445 1727204197.94126: done 
sending task result for task 028d2410-947f-bf02-eee4-00000000005d 41445 1727204197.94129: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41445 1727204197.94212: no more pending results, returning what we have 41445 1727204197.94215: results queue empty 41445 1727204197.94216: checking for any_errors_fatal 41445 1727204197.94223: done checking for any_errors_fatal 41445 1727204197.94224: checking for max_fail_percentage 41445 1727204197.94225: done checking for max_fail_percentage 41445 1727204197.94226: checking to see if all hosts have failed and the running result is not ok 41445 1727204197.94227: done checking to see if all hosts have failed 41445 1727204197.94228: getting the remaining hosts for this loop 41445 1727204197.94229: done getting the remaining hosts for this loop 41445 1727204197.94232: getting the next task for host managed-node3 41445 1727204197.94240: done getting next task for host managed-node3 41445 1727204197.94243: ^ task is: TASK: Assert that the route table 30400 contains the specified route 41445 1727204197.94245: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
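[Editor's note, not part of the captured log] The assert task above passes because both conditionals of the form `route_table_30200.stdout is search("...")` evaluate True. Jinja2's `search` test is essentially a regex search over the string, which the sketch below mirrors with `re.search`. The `stdout_30200` value is reconstructed from the two route strings the log shows matching, not copied verbatim from the run; note that escaping the expected route makes the dots and slashes match literally, whereas the playbook's raw pattern treats `.` as a regex wildcard.

```python
import re

# Reconstructed from the two conditionals that evaluated True above;
# the actual captured stdout is not shown verbatim in this log chunk.
stdout_30200 = (
    "198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4\n"
    "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50"
)

def contains_route(stdout, expected):
    # Mirror of the `is search(...)` test: a regex search over stdout.
    # re.escape makes "." and "/" in the route match literally.
    return re.search(re.escape(expected), stdout) is not None

contains_route(
    stdout_30200,
    "198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4",
)  # True
```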
False 41445 1727204197.94248: getting variables 41445 1727204197.94250: in VariableManager get_vars() 41445 1727204197.94286: Calling all_inventory to load vars for managed-node3 41445 1727204197.94288: Calling groups_inventory to load vars for managed-node3 41445 1727204197.94291: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204197.94300: Calling all_plugins_play to load vars for managed-node3 41445 1727204197.94303: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204197.94305: Calling groups_plugins_play to load vars for managed-node3 41445 1727204197.95205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.01082: done with get_vars() 41445 1727204198.01107: done getting variables 41445 1727204198.01161: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table 30400 contains the specified route] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:76 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.095) 0:00:16.799 ***** 41445 1727204198.01191: entering _queue_task() for managed-node3/assert 41445 1727204198.01540: worker is 1 (out of 1 available) 41445 1727204198.01552: exiting _queue_task() for managed-node3/assert 41445 1727204198.01564: done queuing things up, now waiting for results queue to drain 41445 1727204198.01565: waiting for pending results... 
41445 1727204198.01995: running TaskExecutor() for managed-node3/TASK: Assert that the route table 30400 contains the specified route 41445 1727204198.02001: in run() - task 028d2410-947f-bf02-eee4-00000000005e 41445 1727204198.02004: variable 'ansible_search_path' from source: unknown 41445 1727204198.02036: calling self._execute() 41445 1727204198.02143: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.02155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.02171: variable 'omit' from source: magic vars 41445 1727204198.02573: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.02594: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.02606: variable 'omit' from source: magic vars 41445 1727204198.02633: variable 'omit' from source: magic vars 41445 1727204198.02772: variable 'omit' from source: magic vars 41445 1727204198.02777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204198.02780: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204198.02796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204198.02821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204198.02838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204198.02870: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204198.02885: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.02893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.03001: Set connection var 
ansible_shell_executable to /bin/sh 41445 1727204198.03013: Set connection var ansible_shell_type to sh 41445 1727204198.03025: Set connection var ansible_pipelining to False 41445 1727204198.03038: Set connection var ansible_timeout to 10 41445 1727204198.03045: Set connection var ansible_connection to ssh 41445 1727204198.03058: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204198.03089: variable 'ansible_shell_executable' from source: unknown 41445 1727204198.03107: variable 'ansible_connection' from source: unknown 41445 1727204198.03143: variable 'ansible_module_compression' from source: unknown 41445 1727204198.03208: variable 'ansible_shell_type' from source: unknown 41445 1727204198.03214: variable 'ansible_shell_executable' from source: unknown 41445 1727204198.03217: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.03219: variable 'ansible_pipelining' from source: unknown 41445 1727204198.03221: variable 'ansible_timeout' from source: unknown 41445 1727204198.03223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.03335: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204198.03351: variable 'omit' from source: magic vars 41445 1727204198.03363: starting attempt loop 41445 1727204198.03369: running the handler 41445 1727204198.03572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204198.03830: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204198.03973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 
1727204198.03979: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204198.03993: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204198.04086: variable 'route_table_30400' from source: set_fact 41445 1727204198.04128: Evaluated conditional (route_table_30400.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 41445 1727204198.04140: handler run complete 41445 1727204198.04158: attempt loop complete, returning result 41445 1727204198.04203: _execute() done 41445 1727204198.04207: dumping result to json 41445 1727204198.04212: done dumping result, returning 41445 1727204198.04214: done running TaskExecutor() for managed-node3/TASK: Assert that the route table 30400 contains the specified route [028d2410-947f-bf02-eee4-00000000005e] 41445 1727204198.04217: sending task result for task 028d2410-947f-bf02-eee4-00000000005e ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41445 1727204198.04534: no more pending results, returning what we have 41445 1727204198.04538: results queue empty 41445 1727204198.04539: checking for any_errors_fatal 41445 1727204198.04548: done checking for any_errors_fatal 41445 1727204198.04549: checking for max_fail_percentage 41445 1727204198.04551: done checking for max_fail_percentage 41445 1727204198.04551: checking to see if all hosts have failed and the running result is not ok 41445 1727204198.04553: done checking to see if all hosts have failed 41445 1727204198.04553: getting the remaining hosts for this loop 41445 1727204198.04554: done getting the remaining hosts for this loop 41445 1727204198.04558: getting the next task for host managed-node3 41445 1727204198.04565: done getting next task for host managed-node3 41445 1727204198.04568: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 
41445 1727204198.04570: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204198.04573: getting variables 41445 1727204198.04577: in VariableManager get_vars() 41445 1727204198.04622: Calling all_inventory to load vars for managed-node3 41445 1727204198.04626: Calling groups_inventory to load vars for managed-node3 41445 1727204198.04628: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.04641: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.04645: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.04648: Calling groups_plugins_play to load vars for managed-node3 41445 1727204198.05196: done sending task result for task 028d2410-947f-bf02-eee4-00000000005e 41445 1727204198.05199: WORKER PROCESS EXITING 41445 1727204198.06385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.08596: done with get_vars() 41445 1727204198.08623: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:82 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.075) 0:00:16.875 ***** 41445 1727204198.08742: entering _queue_task() for managed-node3/lineinfile 41445 1727204198.08744: Creating lock for lineinfile 41445 1727204198.09106: worker is 1 (out of 1 available) 41445 1727204198.09120: exiting _queue_task() for managed-node3/lineinfile 41445 1727204198.09140: done queuing things up, now waiting for results queue to drain 41445 1727204198.09142: waiting for pending results... 
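[Editor's note, not part of the captured log] The task being queued here uses `lineinfile` to drop a routing-table definition into `/etc/iproute2/rt_tables.d/`. The sketch below imitates the module's core behavior (create the file if missing, append the line only if absent) against a temp file; the table number `30600`, the name `custom`, and the filename are hypothetical, since this log chunk truncates before showing the actual line the playbook writes. Entries in rt_tables files follow the `<number> <name>` format.

```python
import os
import tempfile

def ensure_line(path, line):
    """Idempotently ensure `line` is present in `path` -- a minimal sketch
    of the lineinfile behavior invoked above: create the file if missing,
    append the line only when it is not already there."""
    existing = []
    if os.path.exists(path):
        with open(path) as f:
            existing = f.read().splitlines()
    if line not in existing:
        with open(path, "a") as f:
            f.write(line + "\n")

# Demo against a temp file; the real task targets /etc/iproute2/rt_tables.d/.
# Table number and name are hypothetical (not shown in this log chunk).
demo = os.path.join(tempfile.mkdtemp(), "custom_tables.conf")
ensure_line(demo, "30600 custom")   # rt_tables entry: "<number> <name>"
ensure_line(demo, "30600 custom")   # second call is a no-op
```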
41445 1727204198.09621: running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 41445 1727204198.09982: in run() - task 028d2410-947f-bf02-eee4-00000000005f 41445 1727204198.09993: variable 'ansible_search_path' from source: unknown 41445 1727204198.09996: calling self._execute() 41445 1727204198.09999: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.10007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.10013: variable 'omit' from source: magic vars 41445 1727204198.10433: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.10437: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.10443: variable 'omit' from source: magic vars 41445 1727204198.10464: variable 'omit' from source: magic vars 41445 1727204198.10556: variable 'omit' from source: magic vars 41445 1727204198.10898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204198.10931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204198.10962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204198.10978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204198.10991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204198.11022: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204198.11025: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.11027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.11136: Set 
connection var ansible_shell_executable to /bin/sh 41445 1727204198.11139: Set connection var ansible_shell_type to sh 41445 1727204198.11141: Set connection var ansible_pipelining to False 41445 1727204198.11300: Set connection var ansible_timeout to 10 41445 1727204198.11304: Set connection var ansible_connection to ssh 41445 1727204198.11307: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204198.11342: variable 'ansible_shell_executable' from source: unknown 41445 1727204198.11345: variable 'ansible_connection' from source: unknown 41445 1727204198.11348: variable 'ansible_module_compression' from source: unknown 41445 1727204198.11350: variable 'ansible_shell_type' from source: unknown 41445 1727204198.11353: variable 'ansible_shell_executable' from source: unknown 41445 1727204198.11355: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.11357: variable 'ansible_pipelining' from source: unknown 41445 1727204198.11359: variable 'ansible_timeout' from source: unknown 41445 1727204198.11366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.11632: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204198.11653: variable 'omit' from source: magic vars 41445 1727204198.11658: starting attempt loop 41445 1727204198.11660: running the handler 41445 1727204198.11663: _low_level_execute_command(): starting 41445 1727204198.11665: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204198.12381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204198.12385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204198.12387: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204198.12432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204198.12464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204198.12469: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204198.12471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204198.12473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204198.12574: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204198.12580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.12620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.14240: stdout chunk (state=3): >>>/root <<< 41445 1727204198.14633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.14637: stdout chunk (state=3): >>><<< 41445 1727204198.14639: stderr chunk (state=3): >>><<< 41445 1727204198.14644: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204198.14880: _low_level_execute_command(): starting 41445 1727204198.14885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868 `" && echo ansible-tmp-1727204198.145297-42749-231515659904868="` echo /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868 `" ) && sleep 0' 41445 1727204198.15686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204198.15734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.15808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.17655: stdout chunk (state=3): >>>ansible-tmp-1727204198.145297-42749-231515659904868=/root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868 <<< 41445 1727204198.17789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.17850: stderr chunk (state=3): >>><<< 41445 1727204198.17868: stdout chunk (state=3): >>><<< 41445 1727204198.17980: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204198.145297-42749-231515659904868=/root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204198.18061: variable 'ansible_module_compression' from source: unknown 41445 1727204198.18254: ANSIBALLZ: Using lock for lineinfile 41445 1727204198.18281: ANSIBALLZ: Acquiring lock 41445 1727204198.18290: ANSIBALLZ: Lock acquired: 140182278001520 41445 1727204198.18298: ANSIBALLZ: Creating module 41445 1727204198.32645: ANSIBALLZ: Writing module into payload 41445 1727204198.32772: ANSIBALLZ: Writing module 41445 1727204198.32802: ANSIBALLZ: Renaming module 41445 1727204198.32819: ANSIBALLZ: Done creating module 41445 1727204198.32839: variable 'ansible_facts' from source: unknown 41445 1727204198.32928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py 41445 1727204198.33159: Sending initial data 41445 1727204198.33162: Sent initial data (158 bytes) 41445 1727204198.33732: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204198.33746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204198.33798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204198.33812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204198.33825: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204198.33911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204198.33935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.34013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.35629: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204198.35686: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204198.35733: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpwtwg2pml /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py <<< 41445 1727204198.35737: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py" <<< 41445 1727204198.35786: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpwtwg2pml" to remote "/root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py" <<< 41445 1727204198.36531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.36562: stderr chunk (state=3): >>><<< 41445 1727204198.36571: stdout chunk (state=3): >>><<< 41445 1727204198.36644: done transferring module to remote 41445 1727204198.36658: _low_level_execute_command(): starting 41445 1727204198.36666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/ /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py && sleep 0' 41445 1727204198.37388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204198.37406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204198.37485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.37569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.39391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.39395: stdout chunk (state=3): >>><<< 41445 1727204198.39397: stderr chunk (state=3): >>><<< 41445 1727204198.39494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204198.39497: _low_level_execute_command(): starting 41445 1727204198.39500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/AnsiballZ_lineinfile.py && sleep 0' 41445 1727204198.40054: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204198.40073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204198.40144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204198.40207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204198.40225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204198.40270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.40352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41445 1727204198.56555: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 41445 1727204198.57906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204198.57910: stderr chunk (state=3): >>><<< 41445 1727204198.57913: stdout chunk (state=3): >>><<< 41445 1727204198.57934: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204198.57961: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204198.57969: _low_level_execute_command(): starting 41445 1727204198.57974: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204198.145297-42749-231515659904868/ > /dev/null 2>&1 && sleep 0' 41445 1727204198.58583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204198.58586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204198.58594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.58648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.60487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.60508: stderr chunk (state=3): >>><<< 41445 1727204198.60541: stdout chunk (state=3): >>><<< 41445 1727204198.60581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204198.60584: handler run complete 41445 1727204198.60651: attempt loop complete, returning result 41445 1727204198.60654: _execute() done 41445 1727204198.60657: dumping result to json 41445 1727204198.60659: done dumping result, returning 41445 1727204198.60661: done running TaskExecutor() for managed-node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [028d2410-947f-bf02-eee4-00000000005f] 41445 1727204198.60663: sending task result for task 028d2410-947f-bf02-eee4-00000000005f 41445 1727204198.60751: done sending task result for task 028d2410-947f-bf02-eee4-00000000005f 41445 1727204198.60754: WORKER PROCESS EXITING changed: [managed-node3] => { "backup": "", "changed": true } MSG: line added 41445 1727204198.60838: no more pending results, returning what we have 41445 1727204198.60841: results queue empty 41445 1727204198.60842: checking for any_errors_fatal 41445 1727204198.60846: done checking for any_errors_fatal 41445 1727204198.60847: checking for max_fail_percentage 41445 1727204198.60849: done checking for max_fail_percentage 41445 1727204198.60849: checking to see if all hosts have failed and the running result is not ok 41445 1727204198.60850: done checking to see if all hosts have failed 41445 1727204198.60851: getting the remaining hosts for this loop 41445 1727204198.60852: done getting the remaining hosts for this loop 41445 1727204198.60855: getting the next task for host managed-node3 41445 1727204198.60864: done getting next task for host managed-node3 41445 1727204198.60869: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204198.60872: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204198.60892: getting variables 41445 1727204198.60894: in VariableManager get_vars() 41445 1727204198.60944: Calling all_inventory to load vars for managed-node3 41445 1727204198.60947: Calling groups_inventory to load vars for managed-node3 41445 1727204198.60949: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.60958: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.60961: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.60963: Calling groups_plugins_play to load vars for managed-node3 41445 1727204198.63133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.65346: done with get_vars() 41445 1727204198.65370: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.567) 0:00:17.443 ***** 41445 1727204198.65543: entering _queue_task() for managed-node3/include_tasks 41445 1727204198.65910: worker is 1 (out of 1 available) 41445 1727204198.65924: exiting _queue_task() for managed-node3/include_tasks 41445 1727204198.65935: done queuing things up, now waiting for results queue to drain 41445 1727204198.65941: waiting for pending results... 
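The `module_args` recorded for the lineinfile run above correspond to a task roughly like the following. This is a reconstruction from the logged arguments (path, line, mode, create, state), not the original playbook source; the task name is taken from the TASK line in the log.

```yaml
# Reconstructed from the logged module_args; not the original playbook source.
- name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
  ansible.builtin.lineinfile:
    path: /etc/iproute2/rt_tables.d/table.conf
    line: "200 custom"
    mode: "0644"
    create: true
    state: present
```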
41445 1727204198.66288: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204198.66467: in run() - task 028d2410-947f-bf02-eee4-000000000067 41445 1727204198.66472: variable 'ansible_search_path' from source: unknown 41445 1727204198.66475: variable 'ansible_search_path' from source: unknown 41445 1727204198.66481: calling self._execute() 41445 1727204198.66525: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.66529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.66539: variable 'omit' from source: magic vars 41445 1727204198.66825: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.66839: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.66846: _execute() done 41445 1727204198.66849: dumping result to json 41445 1727204198.66851: done dumping result, returning 41445 1727204198.66857: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-bf02-eee4-000000000067] 41445 1727204198.66863: sending task result for task 028d2410-947f-bf02-eee4-000000000067 41445 1727204198.66958: done sending task result for task 028d2410-947f-bf02-eee4-000000000067 41445 1727204198.66961: WORKER PROCESS EXITING 41445 1727204198.67011: no more pending results, returning what we have 41445 1727204198.67017: in VariableManager get_vars() 41445 1727204198.67062: Calling all_inventory to load vars for managed-node3 41445 1727204198.67065: Calling groups_inventory to load vars for managed-node3 41445 1727204198.67068: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.67082: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.67084: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.67087: Calling 
groups_plugins_play to load vars for managed-node3 41445 1727204198.67868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.68745: done with get_vars() 41445 1727204198.68758: variable 'ansible_search_path' from source: unknown 41445 1727204198.68759: variable 'ansible_search_path' from source: unknown 41445 1727204198.68787: we have included files to process 41445 1727204198.68787: generating all_blocks data 41445 1727204198.68790: done generating all_blocks data 41445 1727204198.68794: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204198.68794: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204198.68796: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204198.69169: done processing included file 41445 1727204198.69170: iterating over new_blocks loaded from include file 41445 1727204198.69171: in VariableManager get_vars() 41445 1727204198.69189: done with get_vars() 41445 1727204198.69190: filtering new block on tags 41445 1727204198.69201: done filtering new block on tags 41445 1727204198.69203: in VariableManager get_vars() 41445 1727204198.69217: done with get_vars() 41445 1727204198.69218: filtering new block on tags 41445 1727204198.69230: done filtering new block on tags 41445 1727204198.69231: in VariableManager get_vars() 41445 1727204198.69247: done with get_vars() 41445 1727204198.69249: filtering new block on tags 41445 1727204198.69260: done filtering new block on tags 41445 1727204198.69261: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41445 1727204198.69264: extending task lists for 
all hosts with included blocks 41445 1727204198.69710: done extending task lists 41445 1727204198.69711: done processing included files 41445 1727204198.69711: results queue empty 41445 1727204198.69712: checking for any_errors_fatal 41445 1727204198.69716: done checking for any_errors_fatal 41445 1727204198.69716: checking for max_fail_percentage 41445 1727204198.69717: done checking for max_fail_percentage 41445 1727204198.69717: checking to see if all hosts have failed and the running result is not ok 41445 1727204198.69718: done checking to see if all hosts have failed 41445 1727204198.69718: getting the remaining hosts for this loop 41445 1727204198.69719: done getting the remaining hosts for this loop 41445 1727204198.69720: getting the next task for host managed-node3 41445 1727204198.69723: done getting next task for host managed-node3 41445 1727204198.69725: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204198.69727: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204198.69733: getting variables 41445 1727204198.69734: in VariableManager get_vars() 41445 1727204198.69744: Calling all_inventory to load vars for managed-node3 41445 1727204198.69745: Calling groups_inventory to load vars for managed-node3 41445 1727204198.69747: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.69750: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.69751: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.69753: Calling groups_plugins_play to load vars for managed-node3 41445 1727204198.70438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.71290: done with get_vars() 41445 1727204198.71306: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.058) 0:00:17.501 ***** 41445 1727204198.71355: entering _queue_task() for managed-node3/setup 41445 1727204198.71602: worker is 1 (out of 1 available) 41445 1727204198.71614: exiting _queue_task() for managed-node3/setup 41445 1727204198.71626: done queuing things up, now waiting for results queue to drain 41445 1727204198.71627: waiting for pending results... 
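The "Ensure ansible_facts used by role are present" task queued here gates fact gathering on whether any required fact is missing; the conditional it evaluates appears further down in this log as `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. A plain-Python sketch of that check (a hedged approximation of the `difference` filter's set semantics, not Ansible's actual implementation; the values below are illustrative):

```python
# Sketch of the role's fact-presence check:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The `difference` filter behaves like a set difference; names mirror the
# variables in the log, with hypothetical example values.
def needs_fact_gathering(required_facts, ansible_facts):
    """Return True if any required fact has not been gathered yet."""
    missing = set(required_facts) - set(ansible_facts.keys())
    return len(missing) > 0

# Facts already gathered -> conditional is False and the task is skipped,
# matching the "skipping: [managed-node3]" result seen below in the log.
print(needs_fact_gathering(["distribution"], {"distribution": "Fedora"}))  # False
print(needs_fact_gathering(["distribution"], {}))                          # True
```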
41445 1727204198.71807: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204198.71919: in run() - task 028d2410-947f-bf02-eee4-0000000005df 41445 1727204198.71932: variable 'ansible_search_path' from source: unknown 41445 1727204198.71935: variable 'ansible_search_path' from source: unknown 41445 1727204198.71966: calling self._execute() 41445 1727204198.72041: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.72044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.72054: variable 'omit' from source: magic vars 41445 1727204198.72335: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.72345: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.72494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204198.73933: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204198.73984: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204198.74012: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204198.74043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204198.74063: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204198.74125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204198.74147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204198.74164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204198.74192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204198.74203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204198.74244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204198.74261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204198.74278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204198.74304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204198.74317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204198.74420: variable '__network_required_facts' from source: role 
'' defaults 41445 1727204198.74428: variable 'ansible_facts' from source: unknown 41445 1727204198.74965: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41445 1727204198.74969: when evaluation is False, skipping this task 41445 1727204198.74971: _execute() done 41445 1727204198.74974: dumping result to json 41445 1727204198.74978: done dumping result, returning 41445 1727204198.74983: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-bf02-eee4-0000000005df] 41445 1727204198.74989: sending task result for task 028d2410-947f-bf02-eee4-0000000005df 41445 1727204198.75070: done sending task result for task 028d2410-947f-bf02-eee4-0000000005df 41445 1727204198.75073: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204198.75143: no more pending results, returning what we have 41445 1727204198.75146: results queue empty 41445 1727204198.75147: checking for any_errors_fatal 41445 1727204198.75148: done checking for any_errors_fatal 41445 1727204198.75149: checking for max_fail_percentage 41445 1727204198.75150: done checking for max_fail_percentage 41445 1727204198.75151: checking to see if all hosts have failed and the running result is not ok 41445 1727204198.75152: done checking to see if all hosts have failed 41445 1727204198.75152: getting the remaining hosts for this loop 41445 1727204198.75153: done getting the remaining hosts for this loop 41445 1727204198.75157: getting the next task for host managed-node3 41445 1727204198.75166: done getting next task for host managed-node3 41445 1727204198.75170: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204198.75173: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204198.75193: getting variables 41445 1727204198.75195: in VariableManager get_vars() 41445 1727204198.75233: Calling all_inventory to load vars for managed-node3 41445 1727204198.75235: Calling groups_inventory to load vars for managed-node3 41445 1727204198.75237: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.75246: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.75249: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.75251: Calling groups_plugins_play to load vars for managed-node3 41445 1727204198.76036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.77010: done with get_vars() 41445 1727204198.77028: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.057) 0:00:17.558 ***** 41445 1727204198.77100: entering _queue_task() for managed-node3/stat 41445 1727204198.77328: worker is 1 (out of 1 
available) 41445 1727204198.77340: exiting _queue_task() for managed-node3/stat 41445 1727204198.77353: done queuing things up, now waiting for results queue to drain 41445 1727204198.77355: waiting for pending results... 41445 1727204198.77537: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204198.77639: in run() - task 028d2410-947f-bf02-eee4-0000000005e1 41445 1727204198.77652: variable 'ansible_search_path' from source: unknown 41445 1727204198.77655: variable 'ansible_search_path' from source: unknown 41445 1727204198.77684: calling self._execute() 41445 1727204198.77757: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.77761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.77769: variable 'omit' from source: magic vars 41445 1727204198.78047: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.78057: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.78170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204198.78358: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204198.78391: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204198.78417: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204198.78443: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204198.78507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204198.78526: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204198.78544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204198.78565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204198.78627: variable '__network_is_ostree' from source: set_fact 41445 1727204198.78633: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204198.78636: when evaluation is False, skipping this task 41445 1727204198.78638: _execute() done 41445 1727204198.78641: dumping result to json 41445 1727204198.78644: done dumping result, returning 41445 1727204198.78650: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-bf02-eee4-0000000005e1] 41445 1727204198.78655: sending task result for task 028d2410-947f-bf02-eee4-0000000005e1 41445 1727204198.78741: done sending task result for task 028d2410-947f-bf02-eee4-0000000005e1 41445 1727204198.78743: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204198.78832: no more pending results, returning what we have 41445 1727204198.78835: results queue empty 41445 1727204198.78836: checking for any_errors_fatal 41445 1727204198.78840: done checking for any_errors_fatal 41445 1727204198.78841: checking for max_fail_percentage 41445 1727204198.78843: done checking for max_fail_percentage 41445 1727204198.78843: checking to see if all hosts have failed and the running result is not ok 41445 
1727204198.78844: done checking to see if all hosts have failed 41445 1727204198.78845: getting the remaining hosts for this loop 41445 1727204198.78846: done getting the remaining hosts for this loop 41445 1727204198.78849: getting the next task for host managed-node3 41445 1727204198.78854: done getting next task for host managed-node3 41445 1727204198.78857: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204198.78860: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204198.78877: getting variables 41445 1727204198.78879: in VariableManager get_vars() 41445 1727204198.78936: Calling all_inventory to load vars for managed-node3 41445 1727204198.78939: Calling groups_inventory to load vars for managed-node3 41445 1727204198.78941: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.78949: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.78952: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.78954: Calling groups_plugins_play to load vars for managed-node3 41445 1727204198.79707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.80913: done with get_vars() 41445 1727204198.80935: done getting variables 41445 1727204198.80996: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.039) 0:00:17.598 ***** 41445 1727204198.81034: entering _queue_task() for managed-node3/set_fact 41445 1727204198.81359: worker is 1 (out of 1 available) 41445 1727204198.81371: exiting _queue_task() for managed-node3/set_fact 41445 1727204198.81386: done queuing things up, now waiting for results queue to drain 41445 1727204198.81387: waiting for pending results... 
41445 1727204198.81573: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204198.81682: in run() - task 028d2410-947f-bf02-eee4-0000000005e2 41445 1727204198.81697: variable 'ansible_search_path' from source: unknown 41445 1727204198.81701: variable 'ansible_search_path' from source: unknown 41445 1727204198.81728: calling self._execute() 41445 1727204198.81803: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.81807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.81816: variable 'omit' from source: magic vars 41445 1727204198.82092: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.82102: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.82219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204198.82412: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204198.82444: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204198.82471: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204198.82499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204198.82566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204198.82584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204198.82604: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204198.82623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204198.82691: variable '__network_is_ostree' from source: set_fact 41445 1727204198.82696: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204198.82699: when evaluation is False, skipping this task 41445 1727204198.82702: _execute() done 41445 1727204198.82705: dumping result to json 41445 1727204198.82712: done dumping result, returning 41445 1727204198.82718: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-bf02-eee4-0000000005e2] 41445 1727204198.82723: sending task result for task 028d2410-947f-bf02-eee4-0000000005e2 41445 1727204198.82805: done sending task result for task 028d2410-947f-bf02-eee4-0000000005e2 41445 1727204198.82808: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204198.82857: no more pending results, returning what we have 41445 1727204198.82861: results queue empty 41445 1727204198.82861: checking for any_errors_fatal 41445 1727204198.82867: done checking for any_errors_fatal 41445 1727204198.82868: checking for max_fail_percentage 41445 1727204198.82869: done checking for max_fail_percentage 41445 1727204198.82870: checking to see if all hosts have failed and the running result is not ok 41445 1727204198.82871: done checking to see if all hosts have failed 41445 1727204198.82872: getting the remaining hosts for this loop 41445 1727204198.82873: done getting the remaining hosts for this loop 
41445 1727204198.82879: getting the next task for host managed-node3 41445 1727204198.82888: done getting next task for host managed-node3 41445 1727204198.82891: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204198.82895: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204198.82916: getting variables 41445 1727204198.82918: in VariableManager get_vars() 41445 1727204198.82956: Calling all_inventory to load vars for managed-node3 41445 1727204198.82958: Calling groups_inventory to load vars for managed-node3 41445 1727204198.82960: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204198.82969: Calling all_plugins_play to load vars for managed-node3 41445 1727204198.82971: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204198.82973: Calling groups_plugins_play to load vars for managed-node3 41445 1727204198.85208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204198.86741: done with get_vars() 41445 1727204198.86760: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.058) 0:00:17.656 ***** 41445 1727204198.86847: entering _queue_task() for managed-node3/service_facts 41445 1727204198.87403: worker is 1 (out of 1 available) 41445 1727204198.87413: exiting _queue_task() for managed-node3/service_facts 41445 1727204198.87423: done queuing things up, now waiting for results queue to drain 41445 1727204198.87424: waiting for pending results... 
41445 1727204198.87854: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204198.87859: in run() - task 028d2410-947f-bf02-eee4-0000000005e4 41445 1727204198.87880: variable 'ansible_search_path' from source: unknown 41445 1727204198.87884: variable 'ansible_search_path' from source: unknown 41445 1727204198.87887: calling self._execute() 41445 1727204198.87889: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.87892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.87896: variable 'omit' from source: magic vars 41445 1727204198.88142: variable 'ansible_distribution_major_version' from source: facts 41445 1727204198.88154: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204198.88160: variable 'omit' from source: magic vars 41445 1727204198.88381: variable 'omit' from source: magic vars 41445 1727204198.88385: variable 'omit' from source: magic vars 41445 1727204198.88388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204198.88391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204198.88393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204198.88395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204198.88397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204198.88424: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204198.88427: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.88430: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41445 1727204198.88781: Set connection var ansible_shell_executable to /bin/sh 41445 1727204198.88784: Set connection var ansible_shell_type to sh 41445 1727204198.88787: Set connection var ansible_pipelining to False 41445 1727204198.88789: Set connection var ansible_timeout to 10 41445 1727204198.88792: Set connection var ansible_connection to ssh 41445 1727204198.88794: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204198.88796: variable 'ansible_shell_executable' from source: unknown 41445 1727204198.88798: variable 'ansible_connection' from source: unknown 41445 1727204198.88801: variable 'ansible_module_compression' from source: unknown 41445 1727204198.88803: variable 'ansible_shell_type' from source: unknown 41445 1727204198.88805: variable 'ansible_shell_executable' from source: unknown 41445 1727204198.88807: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204198.88812: variable 'ansible_pipelining' from source: unknown 41445 1727204198.88814: variable 'ansible_timeout' from source: unknown 41445 1727204198.88817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204198.88819: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204198.88823: variable 'omit' from source: magic vars 41445 1727204198.88825: starting attempt loop 41445 1727204198.88827: running the handler 41445 1727204198.88829: _low_level_execute_command(): starting 41445 1727204198.88831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204198.89523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204198.89535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41445 1727204198.89546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204198.89561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204198.89574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204198.89589: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204198.89687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204198.90059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.90104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.91791: stdout chunk (state=3): >>>/root <<< 41445 1727204198.91933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.91940: stdout chunk (state=3): >>><<< 41445 1727204198.91948: stderr chunk (state=3): >>><<< 41445 1727204198.92000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204198.92015: _low_level_execute_command(): starting 41445 1727204198.92022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478 `" && echo ansible-tmp-1727204198.9199986-42792-268871558269478="` echo /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478 `" ) && sleep 0' 41445 1727204198.93282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204198.93286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204198.93293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204198.93302: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204198.93304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204198.93305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204198.93631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.95460: stdout chunk (state=3): >>>ansible-tmp-1727204198.9199986-42792-268871558269478=/root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478 <<< 41445 1727204198.95649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204198.95653: stderr chunk (state=3): >>><<< 41445 1727204198.95655: stdout chunk (state=3): >>><<< 41445 1727204198.95679: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204198.9199986-42792-268871558269478=/root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204198.95923: variable 'ansible_module_compression' from source: unknown 41445 1727204198.95965: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41445 1727204198.96013: variable 'ansible_facts' from source: unknown 41445 1727204198.96223: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py 41445 1727204198.96615: Sending initial data 41445 1727204198.96618: Sent initial data (162 bytes) 41445 1727204198.97594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204198.97788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204198.98013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204198.98043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204198.99735: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py" <<< 41445 1727204198.99739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp1yuickds /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py <<< 41445 1727204198.99743: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp1yuickds" to remote "/root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py" <<< 41445 1727204199.01023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204199.01027: stdout chunk (state=3): >>><<< 41445 1727204199.01033: stderr chunk (state=3): >>><<< 41445 1727204199.01081: done transferring module to remote 41445 1727204199.01084: _low_level_execute_command(): starting 41445 1727204199.01086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/ /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py && sleep 0' 41445 1727204199.01982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204199.01986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204199.01988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204199.01991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204199.02026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204199.03851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204199.03855: stdout chunk (state=3): >>><<< 41445 1727204199.03862: stderr chunk (state=3): >>><<< 41445 1727204199.03877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204199.03880: _low_level_execute_command(): starting 41445 1727204199.03885: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/AnsiballZ_service_facts.py && sleep 0' 41445 1727204199.04515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204199.04571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204199.04585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204199.04627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204199.04652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204200.53212: stdout chunk (state=3): >>> {"ansible_facts": 
{"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 41445 1727204200.53222: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 41445 1727204200.53247: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 41445 1727204200.53277: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", 
"status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 41445 1727204200.53285: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": 
"nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 41445 1727204200.54797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204200.54802: stdout chunk (state=3): >>><<< 41445 1727204200.54805: stderr chunk (state=3): >>><<< 41445 1727204200.54860: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204200.55268: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204200.55277: _low_level_execute_command(): starting 41445 1727204200.55282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204198.9199986-42792-268871558269478/ > /dev/null 2>&1 && sleep 0' 41445 1727204200.55720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204200.55724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204200.55726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.55729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204200.55731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.55781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204200.55787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204200.55820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204200.57602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204200.57621: stderr chunk (state=3): >>><<< 41445 1727204200.57627: stdout chunk (state=3): >>><<< 41445 1727204200.57643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 41445 1727204200.57649: handler run complete 41445 1727204200.57767: variable 'ansible_facts' from source: unknown 41445 1727204200.57865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204200.58142: variable 'ansible_facts' from source: unknown 41445 1727204200.58228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204200.58344: attempt loop complete, returning result 41445 1727204200.58347: _execute() done 41445 1727204200.58351: dumping result to json 41445 1727204200.58392: done dumping result, returning 41445 1727204200.58401: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-bf02-eee4-0000000005e4] 41445 1727204200.58406: sending task result for task 028d2410-947f-bf02-eee4-0000000005e4 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204200.59009: no more pending results, returning what we have 41445 1727204200.59011: results queue empty 41445 1727204200.59012: checking for any_errors_fatal 41445 1727204200.59016: done checking for any_errors_fatal 41445 1727204200.59017: checking for max_fail_percentage 41445 1727204200.59019: done checking for max_fail_percentage 41445 1727204200.59020: checking to see if all hosts have failed and the running result is not ok 41445 1727204200.59020: done checking to see if all hosts have failed 41445 1727204200.59021: getting the remaining hosts for this loop 41445 1727204200.59022: done getting the remaining hosts for this loop 41445 1727204200.59025: getting the next task for host managed-node3 41445 1727204200.59030: done getting next task for host managed-node3 41445 1727204200.59033: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 41445 1727204200.59036: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204200.59046: getting variables 41445 1727204200.59047: in VariableManager get_vars() 41445 1727204200.59080: Calling all_inventory to load vars for managed-node3 41445 1727204200.59082: Calling groups_inventory to load vars for managed-node3 41445 1727204200.59084: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204200.59092: Calling all_plugins_play to load vars for managed-node3 41445 1727204200.59093: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204200.59096: Calling groups_plugins_play to load vars for managed-node3 41445 1727204200.59669: done sending task result for task 028d2410-947f-bf02-eee4-0000000005e4 41445 1727204200.59672: WORKER PROCESS EXITING 41445 1727204200.59915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204200.60790: done with get_vars() 41445 1727204200.60805: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:40 -0400 (0:00:01.740) 0:00:19.396 ***** 41445 1727204200.60880: entering _queue_task() for managed-node3/package_facts 41445 1727204200.61119: worker is 1 (out of 1 available) 41445 1727204200.61132: exiting _queue_task() for managed-node3/package_facts 41445 1727204200.61143: done queuing things up, now waiting for results queue to drain 41445 1727204200.61145: waiting for pending results... 41445 1727204200.61327: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41445 1727204200.61431: in run() - task 028d2410-947f-bf02-eee4-0000000005e5 41445 1727204200.61443: variable 'ansible_search_path' from source: unknown 41445 1727204200.61447: variable 'ansible_search_path' from source: unknown 41445 1727204200.61475: calling self._execute() 41445 1727204200.61548: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204200.61552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204200.61560: variable 'omit' from source: magic vars 41445 1727204200.61846: variable 'ansible_distribution_major_version' from source: facts 41445 1727204200.61856: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204200.61861: variable 'omit' from source: magic vars 41445 1727204200.61913: variable 'omit' from source: magic vars 41445 1727204200.61937: variable 'omit' from source: magic vars 41445 1727204200.61969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204200.61997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204200.62015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204200.62033: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204200.62039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204200.62063: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204200.62066: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204200.62069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204200.62138: Set connection var ansible_shell_executable to /bin/sh 41445 1727204200.62142: Set connection var ansible_shell_type to sh 41445 1727204200.62145: Set connection var ansible_pipelining to False 41445 1727204200.62155: Set connection var ansible_timeout to 10 41445 1727204200.62157: Set connection var ansible_connection to ssh 41445 1727204200.62160: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204200.62179: variable 'ansible_shell_executable' from source: unknown 41445 1727204200.62182: variable 'ansible_connection' from source: unknown 41445 1727204200.62186: variable 'ansible_module_compression' from source: unknown 41445 1727204200.62188: variable 'ansible_shell_type' from source: unknown 41445 1727204200.62191: variable 'ansible_shell_executable' from source: unknown 41445 1727204200.62193: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204200.62195: variable 'ansible_pipelining' from source: unknown 41445 1727204200.62198: variable 'ansible_timeout' from source: unknown 41445 1727204200.62202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204200.62345: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204200.62352: variable 'omit' from source: magic vars 41445 1727204200.62355: starting attempt loop 41445 1727204200.62358: running the handler 41445 1727204200.62373: _low_level_execute_command(): starting 41445 1727204200.62383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204200.62895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204200.62899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.62903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204200.62906: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.62957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204200.62960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204200.62963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204200.63002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41445 1727204200.64561: stdout chunk (state=3): >>>/root <<< 41445 1727204200.64663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204200.64692: stderr chunk (state=3): >>><<< 41445 1727204200.64696: stdout chunk (state=3): >>><<< 41445 1727204200.64715: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204200.64726: _low_level_execute_command(): starting 41445 1727204200.64732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584 `" && echo ansible-tmp-1727204200.6471372-42865-255429138135584="` echo /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584 `" ) && sleep 0' 41445 
1727204200.65172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204200.65177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204200.65180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204200.65189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204200.65191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.65238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204200.65245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204200.65247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204200.65279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204200.67115: stdout chunk (state=3): >>>ansible-tmp-1727204200.6471372-42865-255429138135584=/root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584 <<< 41445 1727204200.67222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204200.67250: stderr chunk (state=3): >>><<< 41445 1727204200.67253: stdout chunk (state=3): >>><<< 
41445 1727204200.67267: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204200.6471372-42865-255429138135584=/root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204200.67304: variable 'ansible_module_compression' from source: unknown 41445 1727204200.67340: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41445 1727204200.67390: variable 'ansible_facts' from source: unknown 41445 1727204200.67507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py 41445 1727204200.67608: Sending initial data 41445 1727204200.67611: Sent initial data (162 bytes) 41445 1727204200.68048: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204200.68052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204200.68054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.68056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204200.68058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.68119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204200.68127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204200.68133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204200.68152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204200.69672: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204200.69764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204200.69769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpd4bzcjp9 /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py <<< 41445 1727204200.69771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py" <<< 41445 1727204200.69842: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpd4bzcjp9" to remote "/root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py" <<< 41445 1727204200.71367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204200.71416: stderr chunk (state=3): >>><<< 41445 1727204200.71448: stdout chunk (state=3): >>><<< 41445 1727204200.71557: done transferring module to remote 41445 1727204200.71561: _low_level_execute_command(): starting 41445 1727204200.71563: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/ /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py && sleep 0' 41445 1727204200.72240: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204200.72296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.72387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204200.72467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204200.72489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204200.74317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204200.74339: stderr chunk (state=3): >>><<< 41445 1727204200.74342: stdout chunk (state=3): >>><<< 41445 1727204200.74356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204200.74360: _low_level_execute_command(): starting 41445 1727204200.74379: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/AnsiballZ_package_facts.py && sleep 0' 41445 1727204200.74795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204200.74799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.74822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204200.74865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204200.74868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204200.74917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204201.19497: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 41445 1727204201.19519: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": 
[{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 41445 1727204201.19539: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": 
"glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 41445 1727204201.19567: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": 
"19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": 
"1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 41445 1727204201.19589: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": 
"1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 41445 1727204201.19620: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 41445 1727204201.19639: stdout chunk (state=3): 
>>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": 
[{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 41445 1727204201.19648: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": 
[{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": 
[{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 41445 1727204201.19680: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": 
"perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 41445 1727204201.19688: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", 
"epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": 
"0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 41445 1727204201.19701: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", 
"version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 41445 1727204201.19713: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41445 1727204201.21485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204201.21520: stderr chunk (state=3): >>><<< 41445 1727204201.21524: stdout chunk (state=3): >>><<< 41445 1727204201.21573: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
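The stdout captured above is the JSON returned by the `package_facts` module: under `ansible_facts.packages`, each key maps a package name to a *list* of install records (a list because multiple arches or versions of the same name can coexist). A minimal Python sketch of consuming that structure — the sample entries are copied from the log above, and the helper name is hypothetical:

```python
# Shape mirrors ansible_facts.packages from the package_facts output above:
# package name -> list of install records (dicts with name/version/release/
# epoch/arch/source). Sample entries copied from the captured stdout.
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10",
                   "epoch": None, "arch": "noarch", "source": "rpm"}],
}

def installed_version(facts, name):
    """Return the version of the first install record for `name`, or None."""
    records = facts.get(name)
    return records[0]["version"] if records else None

print(installed_version(packages, "openssl"))   # -> 3.2.2
print(installed_version(packages, "dnsmasq"))   # -> None (not in this sample)
```

In a playbook the same lookup is typically done with a conditional such as `when: "'firewalld' in ansible_facts.packages"`, which is how roles like `fedora.linux_system_roles.network` decide which provider packages are present.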
41445 1727204201.22874: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204201.22894: _low_level_execute_command(): starting 41445 1727204201.22897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204200.6471372-42865-255429138135584/ > /dev/null 2>&1 && sleep 0' 41445 1727204201.23368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204201.23371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204201.23373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204201.23378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204201.23431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204201.23434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204201.23437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204201.23482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204201.25272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204201.25301: stderr chunk (state=3): >>><<< 41445 1727204201.25305: stdout chunk (state=3): >>><<< 41445 1727204201.25322: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 41445 1727204201.25328: handler run complete 41445 1727204201.25776: variable 'ansible_facts' from source: unknown 41445 1727204201.26035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.27076: variable 'ansible_facts' from source: unknown 41445 1727204201.27364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.27739: attempt loop complete, returning result 41445 1727204201.27747: _execute() done 41445 1727204201.27750: dumping result to json 41445 1727204201.27863: done dumping result, returning 41445 1727204201.27871: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-bf02-eee4-0000000005e5] 41445 1727204201.27879: sending task result for task 028d2410-947f-bf02-eee4-0000000005e5 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204201.29193: done sending task result for task 028d2410-947f-bf02-eee4-0000000005e5 41445 1727204201.29197: WORKER PROCESS EXITING 41445 1727204201.29206: no more pending results, returning what we have 41445 1727204201.29208: results queue empty 41445 1727204201.29209: checking for any_errors_fatal 41445 1727204201.29213: done checking for any_errors_fatal 41445 1727204201.29214: checking for max_fail_percentage 41445 1727204201.29215: done checking for max_fail_percentage 41445 1727204201.29215: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.29216: done checking to see if all hosts have failed 41445 1727204201.29216: getting the remaining hosts for this loop 41445 1727204201.29217: done getting the remaining hosts for this loop 41445 1727204201.29220: getting the next task for host managed-node3 41445 1727204201.29225: done 
getting next task for host managed-node3 41445 1727204201.29227: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41445 1727204201.29229: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204201.29239: getting variables 41445 1727204201.29240: in VariableManager get_vars() 41445 1727204201.29265: Calling all_inventory to load vars for managed-node3 41445 1727204201.29267: Calling groups_inventory to load vars for managed-node3 41445 1727204201.29268: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.29278: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.29280: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.29282: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.30025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.30886: done with get_vars() 41445 1727204201.30903: done getting variables 41445 1727204201.30947: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.700) 0:00:20.097 ***** 41445 1727204201.30979: entering _queue_task() for managed-node3/debug 41445 1727204201.31222: worker is 1 (out of 1 available) 41445 1727204201.31239: exiting _queue_task() for managed-node3/debug 41445 1727204201.31250: done queuing things up, now waiting for results queue to drain 41445 1727204201.31252: waiting for pending results... 41445 1727204201.31433: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 41445 1727204201.31527: in run() - task 028d2410-947f-bf02-eee4-000000000068 41445 1727204201.31541: variable 'ansible_search_path' from source: unknown 41445 1727204201.31546: variable 'ansible_search_path' from source: unknown 41445 1727204201.31572: calling self._execute() 41445 1727204201.31647: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.31651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.31659: variable 'omit' from source: magic vars 41445 1727204201.31943: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.31954: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.31959: variable 'omit' from source: magic vars 41445 1727204201.31996: variable 'omit' from source: magic vars 41445 1727204201.32069: variable 'network_provider' from source: set_fact 41445 1727204201.32085: variable 'omit' from source: magic vars 41445 1727204201.32120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204201.32149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204201.32163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 
1727204201.32180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204201.32190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204201.32212: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204201.32218: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.32220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.32290: Set connection var ansible_shell_executable to /bin/sh 41445 1727204201.32294: Set connection var ansible_shell_type to sh 41445 1727204201.32297: Set connection var ansible_pipelining to False 41445 1727204201.32305: Set connection var ansible_timeout to 10 41445 1727204201.32308: Set connection var ansible_connection to ssh 41445 1727204201.32317: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204201.32334: variable 'ansible_shell_executable' from source: unknown 41445 1727204201.32337: variable 'ansible_connection' from source: unknown 41445 1727204201.32339: variable 'ansible_module_compression' from source: unknown 41445 1727204201.32344: variable 'ansible_shell_type' from source: unknown 41445 1727204201.32346: variable 'ansible_shell_executable' from source: unknown 41445 1727204201.32349: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.32351: variable 'ansible_pipelining' from source: unknown 41445 1727204201.32353: variable 'ansible_timeout' from source: unknown 41445 1727204201.32356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.32458: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204201.32468: variable 'omit' from source: magic vars 41445 1727204201.32471: starting attempt loop 41445 1727204201.32473: running the handler 41445 1727204201.32517: handler run complete 41445 1727204201.32526: attempt loop complete, returning result 41445 1727204201.32529: _execute() done 41445 1727204201.32531: dumping result to json 41445 1727204201.32534: done dumping result, returning 41445 1727204201.32541: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-bf02-eee4-000000000068] 41445 1727204201.32547: sending task result for task 028d2410-947f-bf02-eee4-000000000068 41445 1727204201.32624: done sending task result for task 028d2410-947f-bf02-eee4-000000000068 41445 1727204201.32627: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 41445 1727204201.32691: no more pending results, returning what we have 41445 1727204201.32695: results queue empty 41445 1727204201.32695: checking for any_errors_fatal 41445 1727204201.32704: done checking for any_errors_fatal 41445 1727204201.32705: checking for max_fail_percentage 41445 1727204201.32707: done checking for max_fail_percentage 41445 1727204201.32707: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.32708: done checking to see if all hosts have failed 41445 1727204201.32709: getting the remaining hosts for this loop 41445 1727204201.32710: done getting the remaining hosts for this loop 41445 1727204201.32714: getting the next task for host managed-node3 41445 1727204201.32720: done getting next task for host managed-node3 41445 1727204201.32723: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 41445 1727204201.32725: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204201.32737: getting variables 41445 1727204201.32739: in VariableManager get_vars() 41445 1727204201.32777: Calling all_inventory to load vars for managed-node3 41445 1727204201.32779: Calling groups_inventory to load vars for managed-node3 41445 1727204201.32781: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.32790: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.32793: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.32795: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.33555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.34414: done with get_vars() 41445 1727204201.34431: done getting variables 41445 1727204201.34471: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.035) 0:00:20.132 ***** 41445 1727204201.34496: entering _queue_task() for managed-node3/fail 41445 1727204201.34724: worker is 1 (out of 1 available) 41445 1727204201.34737: exiting _queue_task() for managed-node3/fail 41445 1727204201.34749: done queuing things up, now waiting for results queue to drain 41445 1727204201.34750: waiting for pending results... 41445 1727204201.34930: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41445 1727204201.35020: in run() - task 028d2410-947f-bf02-eee4-000000000069 41445 1727204201.35033: variable 'ansible_search_path' from source: unknown 41445 1727204201.35036: variable 'ansible_search_path' from source: unknown 41445 1727204201.35064: calling self._execute() 41445 1727204201.35141: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.35145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.35154: variable 'omit' from source: magic vars 41445 1727204201.35440: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.35450: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.35537: variable 'network_state' from source: role '' defaults 41445 1727204201.35545: Evaluated conditional (network_state != {}): False 41445 1727204201.35549: when evaluation is False, skipping this task 41445 1727204201.35551: _execute() done 41445 1727204201.35554: dumping result to json 41445 1727204201.35556: done dumping result, returning 41445 1727204201.35563: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [028d2410-947f-bf02-eee4-000000000069] 41445 1727204201.35568: sending task result for task 028d2410-947f-bf02-eee4-000000000069 41445 1727204201.35651: done sending task result for task 028d2410-947f-bf02-eee4-000000000069 41445 1727204201.35654: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204201.35702: no more pending results, returning what we have 41445 1727204201.35706: results queue empty 41445 1727204201.35706: checking for any_errors_fatal 41445 1727204201.35714: done checking for any_errors_fatal 41445 1727204201.35715: checking for max_fail_percentage 41445 1727204201.35717: done checking for max_fail_percentage 41445 1727204201.35717: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.35718: done checking to see if all hosts have failed 41445 1727204201.35719: getting the remaining hosts for this loop 41445 1727204201.35720: done getting the remaining hosts for this loop 41445 1727204201.35724: getting the next task for host managed-node3 41445 1727204201.35731: done getting next task for host managed-node3 41445 1727204201.35734: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41445 1727204201.35736: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41445 1727204201.35755: getting variables 41445 1727204201.35757: in VariableManager get_vars() 41445 1727204201.35793: Calling all_inventory to load vars for managed-node3 41445 1727204201.35796: Calling groups_inventory to load vars for managed-node3 41445 1727204201.35798: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.35806: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.35808: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.35811: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.36683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.37546: done with get_vars() 41445 1727204201.37561: done getting variables 41445 1727204201.37603: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.031) 0:00:20.164 ***** 41445 1727204201.37629: entering _queue_task() for managed-node3/fail 41445 1727204201.37845: worker is 1 (out of 1 available) 41445 1727204201.37859: exiting _queue_task() for managed-node3/fail 41445 1727204201.37871: done queuing things up, now waiting for results queue to drain 41445 1727204201.37872: waiting for pending results... 
41445 1727204201.38050: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41445 1727204201.38141: in run() - task 028d2410-947f-bf02-eee4-00000000006a 41445 1727204201.38152: variable 'ansible_search_path' from source: unknown 41445 1727204201.38156: variable 'ansible_search_path' from source: unknown 41445 1727204201.38185: calling self._execute() 41445 1727204201.38255: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.38259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.38268: variable 'omit' from source: magic vars 41445 1727204201.38536: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.38552: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.38635: variable 'network_state' from source: role '' defaults 41445 1727204201.38650: Evaluated conditional (network_state != {}): False 41445 1727204201.38654: when evaluation is False, skipping this task 41445 1727204201.38656: _execute() done 41445 1727204201.38659: dumping result to json 41445 1727204201.38661: done dumping result, returning 41445 1727204201.38664: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-bf02-eee4-00000000006a] 41445 1727204201.38667: sending task result for task 028d2410-947f-bf02-eee4-00000000006a 41445 1727204201.38750: done sending task result for task 028d2410-947f-bf02-eee4-00000000006a 41445 1727204201.38753: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204201.38801: no more pending results, returning what we have 41445 
1727204201.38805: results queue empty 41445 1727204201.38806: checking for any_errors_fatal 41445 1727204201.38813: done checking for any_errors_fatal 41445 1727204201.38814: checking for max_fail_percentage 41445 1727204201.38816: done checking for max_fail_percentage 41445 1727204201.38817: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.38818: done checking to see if all hosts have failed 41445 1727204201.38819: getting the remaining hosts for this loop 41445 1727204201.38820: done getting the remaining hosts for this loop 41445 1727204201.38823: getting the next task for host managed-node3 41445 1727204201.38830: done getting next task for host managed-node3 41445 1727204201.38833: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41445 1727204201.38835: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204201.38852: getting variables 41445 1727204201.38853: in VariableManager get_vars() 41445 1727204201.38888: Calling all_inventory to load vars for managed-node3 41445 1727204201.38890: Calling groups_inventory to load vars for managed-node3 41445 1727204201.38892: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.38900: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.38902: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.38905: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.39643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.40500: done with get_vars() 41445 1727204201.40516: done getting variables 41445 1727204201.40556: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.029) 0:00:20.193 ***** 41445 1727204201.40579: entering _queue_task() for managed-node3/fail 41445 1727204201.40785: worker is 1 (out of 1 available) 41445 1727204201.40799: exiting _queue_task() for managed-node3/fail 41445 1727204201.40811: done queuing things up, now waiting for results queue to drain 41445 1727204201.40812: waiting for pending results... 
41445 1727204201.40983: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41445 1727204201.41068: in run() - task 028d2410-947f-bf02-eee4-00000000006b 41445 1727204201.41080: variable 'ansible_search_path' from source: unknown 41445 1727204201.41083: variable 'ansible_search_path' from source: unknown 41445 1727204201.41111: calling self._execute() 41445 1727204201.41183: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.41186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.41195: variable 'omit' from source: magic vars 41445 1727204201.41463: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.41479: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.41596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204201.43269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204201.43329: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204201.43351: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204201.43378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204201.43397: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204201.43458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.43480: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.43498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.43528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.43543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.43605: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.43619: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41445 1727204201.43697: variable 'ansible_distribution' from source: facts 41445 1727204201.43700: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.43708: Evaluated conditional (ansible_distribution in __network_rh_distros): True 41445 1727204201.43871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.43887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.43904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 
1727204201.43933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.43943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.43981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.43996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.44015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.44040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.44050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.44082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.44099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 41445 1727204201.44118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.44142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.44152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.44349: variable 'network_connections' from source: task vars 41445 1727204201.44358: variable 'interface' from source: set_fact 41445 1727204201.44407: variable 'interface' from source: set_fact 41445 1727204201.44420: variable 'interface' from source: set_fact 41445 1727204201.44461: variable 'interface' from source: set_fact 41445 1727204201.44473: variable 'network_state' from source: role '' defaults 41445 1727204201.44521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204201.44626: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204201.44656: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204201.44690: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204201.44712: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204201.44745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204201.44767: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204201.44787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.44805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204201.44832: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 41445 1727204201.44836: when evaluation is False, skipping this task 41445 1727204201.44838: _execute() done 41445 1727204201.44840: dumping result to json 41445 1727204201.44846: done dumping result, returning 41445 1727204201.44856: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-bf02-eee4-00000000006b] 41445 1727204201.44859: sending task result for task 028d2410-947f-bf02-eee4-00000000006b 41445 1727204201.44939: done sending task result for task 028d2410-947f-bf02-eee4-00000000006b 41445 1727204201.44942: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 41445 
1727204201.44995: no more pending results, returning what we have 41445 1727204201.44998: results queue empty 41445 1727204201.44999: checking for any_errors_fatal 41445 1727204201.45004: done checking for any_errors_fatal 41445 1727204201.45004: checking for max_fail_percentage 41445 1727204201.45006: done checking for max_fail_percentage 41445 1727204201.45007: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.45007: done checking to see if all hosts have failed 41445 1727204201.45008: getting the remaining hosts for this loop 41445 1727204201.45011: done getting the remaining hosts for this loop 41445 1727204201.45015: getting the next task for host managed-node3 41445 1727204201.45023: done getting next task for host managed-node3 41445 1727204201.45026: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41445 1727204201.45029: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204201.45046: getting variables 41445 1727204201.45048: in VariableManager get_vars() 41445 1727204201.45089: Calling all_inventory to load vars for managed-node3 41445 1727204201.45091: Calling groups_inventory to load vars for managed-node3 41445 1727204201.45093: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.45103: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.45106: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.45108: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.45994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.46862: done with get_vars() 41445 1727204201.46879: done getting variables 41445 1727204201.46925: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.063) 0:00:20.257 ***** 41445 1727204201.46946: entering _queue_task() for managed-node3/dnf 41445 1727204201.47170: worker is 1 (out of 1 available) 41445 1727204201.47186: exiting _queue_task() for managed-node3/dnf 41445 1727204201.47198: done queuing things up, now waiting for results queue to drain 41445 1727204201.47199: waiting for pending results... 
41445 1727204201.47368: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
41445 1727204201.47455: in run() - task 028d2410-947f-bf02-eee4-00000000006c
41445 1727204201.47467: variable 'ansible_search_path' from source: unknown
41445 1727204201.47471: variable 'ansible_search_path' from source: unknown
41445 1727204201.47499: calling self._execute()
41445 1727204201.47570: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204201.47575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204201.47585: variable 'omit' from source: magic vars
41445 1727204201.47850: variable 'ansible_distribution_major_version' from source: facts
41445 1727204201.47865: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204201.48000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41445 1727204201.49469: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41445 1727204201.49518: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41445 1727204201.49545: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41445 1727204201.49569: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41445 1727204201.49594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41445 1727204201.49654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.49673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.49693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.49724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.49735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.49823: variable 'ansible_distribution' from source: facts
41445 1727204201.49826: variable 'ansible_distribution_major_version' from source: facts
41445 1727204201.49833: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
41445 1727204201.49911: variable '__network_wireless_connections_defined' from source: role '' defaults
41445 1727204201.49996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.50015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.50032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.50061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.50072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.50101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.50118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.50135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.50163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.50173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.50202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.50220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.50236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.50264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.50274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.50380: variable 'network_connections' from source: task vars
41445 1727204201.50383: variable 'interface' from source: set_fact
41445 1727204201.50430: variable 'interface' from source: set_fact
41445 1727204201.50437: variable 'interface' from source: set_fact
41445 1727204201.50481: variable 'interface' from source: set_fact
41445 1727204201.50531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41445 1727204201.50648: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41445 1727204201.50676: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41445 1727204201.50701: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41445 1727204201.50724: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41445 1727204201.50753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41445 1727204201.50768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41445 1727204201.50791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.50815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41445 1727204201.50855: variable '__network_team_connections_defined' from source: role '' defaults
41445 1727204201.51000: variable 'network_connections' from source: task vars
41445 1727204201.51003: variable 'interface' from source: set_fact
41445 1727204201.51049: variable 'interface' from source: set_fact
41445 1727204201.51055: variable 'interface' from source: set_fact
41445 1727204201.51098: variable 'interface' from source: set_fact
41445 1727204201.51131: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
41445 1727204201.51135: when evaluation is False, skipping this task
41445 1727204201.51137: _execute() done
41445 1727204201.51139: dumping result to json
41445 1727204201.51141: done dumping result, returning
41445 1727204201.51144: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-00000000006c]
41445 1727204201.51148: sending task result for task 028d2410-947f-bf02-eee4-00000000006c
41445 1727204201.51231: done sending task result for task 028d2410-947f-bf02-eee4-00000000006c
41445 1727204201.51235: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
41445 1727204201.51287: no more pending results, returning what we have
41445 1727204201.51290: results queue empty
41445 1727204201.51291: checking for any_errors_fatal
41445 1727204201.51296: done checking for any_errors_fatal
41445 1727204201.51297: checking for max_fail_percentage
41445 1727204201.51299: done checking for max_fail_percentage
41445 1727204201.51300: checking to see if all hosts have failed and the running result is not ok
41445 1727204201.51300: done checking to see if all hosts have failed
41445 1727204201.51301: getting the remaining hosts for this loop
41445 1727204201.51302: done getting the remaining hosts for this loop
41445 1727204201.51306: getting the next task for host managed-node3
41445 1727204201.51315: done getting next task for host managed-node3
41445 1727204201.51318: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
41445 1727204201.51321: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204201.51339: getting variables
41445 1727204201.51341: in VariableManager get_vars()
41445 1727204201.51382: Calling all_inventory to load vars for managed-node3
41445 1727204201.51384: Calling groups_inventory to load vars for managed-node3
41445 1727204201.51386: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204201.51395: Calling all_plugins_play to load vars for managed-node3
41445 1727204201.51397: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204201.51400: Calling groups_plugins_play to load vars for managed-node3
41445 1727204201.52196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204201.53160: done with get_vars()
41445 1727204201.53178: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41445 1727204201.53237: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.063) 0:00:20.320 *****
41445 1727204201.53259: entering _queue_task() for managed-node3/yum
41445 1727204201.53503: worker is 1 (out of 1 available)
41445 1727204201.53519: exiting _queue_task() for managed-node3/yum
41445 1727204201.53531: done queuing things up, now waiting for results queue to drain
41445 1727204201.53532: waiting for pending results...
41445 1727204201.53708: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
41445 1727204201.53801: in run() - task 028d2410-947f-bf02-eee4-00000000006d
41445 1727204201.53815: variable 'ansible_search_path' from source: unknown
41445 1727204201.53818: variable 'ansible_search_path' from source: unknown
41445 1727204201.53845: calling self._execute()
41445 1727204201.53985: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204201.53989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204201.53993: variable 'omit' from source: magic vars
41445 1727204201.54194: variable 'ansible_distribution_major_version' from source: facts
41445 1727204201.54206: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204201.54327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41445 1727204201.55802: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41445 1727204201.55848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41445 1727204201.55874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41445 1727204201.55901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41445 1727204201.55923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41445 1727204201.55983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.56003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.56021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.56051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.56062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.56129: variable 'ansible_distribution_major_version' from source: facts
41445 1727204201.56140: Evaluated conditional (ansible_distribution_major_version | int < 8): False
41445 1727204201.56145: when evaluation is False, skipping this task
41445 1727204201.56148: _execute() done
41445 1727204201.56150: dumping result to json
41445 1727204201.56152: done dumping result, returning
41445 1727204201.56163: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-00000000006d]
41445 1727204201.56166: sending task result for task 028d2410-947f-bf02-eee4-00000000006d
41445 1727204201.56247: done sending task result for task 028d2410-947f-bf02-eee4-00000000006d
41445 1727204201.56250: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
41445 1727204201.56310: no more pending results, returning what we have
41445 1727204201.56313: results queue empty
41445 1727204201.56314: checking for any_errors_fatal
41445 1727204201.56319: done checking for any_errors_fatal
41445 1727204201.56319: checking for max_fail_percentage
41445 1727204201.56321: done checking for max_fail_percentage
41445 1727204201.56322: checking to see if all hosts have failed and the running result is not ok
41445 1727204201.56323: done checking to see if all hosts have failed
41445 1727204201.56323: getting the remaining hosts for this loop
41445 1727204201.56324: done getting the remaining hosts for this loop
41445 1727204201.56328: getting the next task for host managed-node3
41445 1727204201.56335: done getting next task for host managed-node3
41445 1727204201.56339: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
41445 1727204201.56341: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204201.56361: getting variables
41445 1727204201.56362: in VariableManager get_vars()
41445 1727204201.56403: Calling all_inventory to load vars for managed-node3
41445 1727204201.56405: Calling groups_inventory to load vars for managed-node3
41445 1727204201.56407: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204201.56416: Calling all_plugins_play to load vars for managed-node3
41445 1727204201.56419: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204201.56421: Calling groups_plugins_play to load vars for managed-node3
41445 1727204201.57216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204201.58084: done with get_vars()
41445 1727204201.58099: done getting variables
41445 1727204201.58141: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.049) 0:00:20.369 *****
41445 1727204201.58164: entering _queue_task() for managed-node3/fail
41445 1727204201.58388: worker is 1 (out of 1 available)
41445 1727204201.58401: exiting _queue_task() for managed-node3/fail
41445 1727204201.58413: done queuing things up, now waiting for results queue to drain
41445 1727204201.58414: waiting for pending results...
41445 1727204201.58592: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
41445 1727204201.58681: in run() - task 028d2410-947f-bf02-eee4-00000000006e
41445 1727204201.58693: variable 'ansible_search_path' from source: unknown
41445 1727204201.58697: variable 'ansible_search_path' from source: unknown
41445 1727204201.58728: calling self._execute()
41445 1727204201.58795: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204201.58799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204201.58808: variable 'omit' from source: magic vars
41445 1727204201.59078: variable 'ansible_distribution_major_version' from source: facts
41445 1727204201.59090: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204201.59172: variable '__network_wireless_connections_defined' from source: role '' defaults
41445 1727204201.59309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41445 1727204201.60770: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41445 1727204201.60817: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41445 1727204201.60844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41445 1727204201.60869: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41445 1727204201.60891: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41445 1727204201.60953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.60973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.60992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.61023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.61031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.61067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.61084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.61100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.61131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.61140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.61170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204201.61190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204201.61206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.61233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204201.61244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204201.61356: variable 'network_connections' from source: task vars
41445 1727204201.61371: variable 'interface' from source: set_fact
41445 1727204201.61418: variable 'interface' from source: set_fact
41445 1727204201.61427: variable 'interface' from source: set_fact
41445 1727204201.61469: variable 'interface' from source: set_fact
41445 1727204201.61526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41445 1727204201.61859: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41445 1727204201.61888: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41445 1727204201.61914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41445 1727204201.61936: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41445 1727204201.61966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41445 1727204201.61983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41445 1727204201.62002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204201.62026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41445 1727204201.62070: variable '__network_team_connections_defined' from source: role '' defaults
41445 1727204201.62221: variable 'network_connections' from source: task vars
41445 1727204201.62225: variable 'interface' from source: set_fact
41445 1727204201.62270: variable 'interface' from source: set_fact
41445 1727204201.62276: variable 'interface' from source: set_fact
41445 1727204201.62322: variable 'interface' from source: set_fact
41445 1727204201.62350: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
41445 1727204201.62353: when evaluation is False, skipping this task
41445 1727204201.62356: _execute() done
41445 1727204201.62358: dumping result to json
41445 1727204201.62361: done dumping result, returning
41445 1727204201.62367: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-00000000006e]
41445 1727204201.62381: sending task result for task 028d2410-947f-bf02-eee4-00000000006e
41445 1727204201.62458: done sending task result for task 028d2410-947f-bf02-eee4-00000000006e
41445 1727204201.62461: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
41445 1727204201.62514: no more pending results, returning what we have
41445 1727204201.62517: results queue empty
41445 1727204201.62518: checking for any_errors_fatal
41445 1727204201.62522: done checking for any_errors_fatal
41445 1727204201.62522: checking for max_fail_percentage
41445 1727204201.62524: done checking for max_fail_percentage
41445 1727204201.62525: checking to see if all hosts have failed and the running result is not ok
41445 1727204201.62526: done checking to see if all hosts have failed
41445 1727204201.62526: getting the remaining hosts for this loop
41445 1727204201.62527: done getting the remaining hosts for this loop
41445 1727204201.62531: getting the next task for host managed-node3
41445 1727204201.62538: done getting next task for host managed-node3
41445 1727204201.62541: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
41445 1727204201.62543: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204201.62562: getting variables
41445 1727204201.62563: in VariableManager get_vars()
41445 1727204201.62611: Calling all_inventory to load vars for managed-node3
41445 1727204201.62614: Calling groups_inventory to load vars for managed-node3
41445 1727204201.62616: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204201.62625: Calling all_plugins_play to load vars for managed-node3
41445 1727204201.62628: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204201.62630: Calling groups_plugins_play to load vars for managed-node3
41445 1727204201.66781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204201.67633: done with get_vars()
41445 1727204201.67650: done getting variables
41445 1727204201.67687: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.095) 0:00:20.464 *****
41445 1727204201.67709: entering _queue_task() for managed-node3/package
41445 1727204201.67963: worker is 1 (out of 1 available)
41445 1727204201.67974: exiting _queue_task() for managed-node3/package
41445 1727204201.67989: done queuing things up, now waiting for results queue to drain
41445 1727204201.67990: waiting for pending results...
41445 1727204201.68181: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41445 1727204201.68277: in run() - task 028d2410-947f-bf02-eee4-00000000006f 41445 1727204201.68291: variable 'ansible_search_path' from source: unknown 41445 1727204201.68295: variable 'ansible_search_path' from source: unknown 41445 1727204201.68328: calling self._execute() 41445 1727204201.68406: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.68410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.68421: variable 'omit' from source: magic vars 41445 1727204201.68713: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.68727: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.68866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204201.69058: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204201.69093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204201.69148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204201.69176: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204201.69259: variable 'network_packages' from source: role '' defaults 41445 1727204201.69418: variable '__network_provider_setup' from source: role '' defaults 41445 1727204201.69421: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204201.69424: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204201.69427: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204201.69445: variable 
'__network_packages_default_nm' from source: role '' defaults 41445 1727204201.69562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204201.70895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204201.70939: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204201.70967: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204201.70992: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204201.71022: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204201.71081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.71103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.71123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.71150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.71161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 
1727204201.71196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.71213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.71232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.71256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.71267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.71411: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204201.71482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.71501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.71521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.71545: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.71555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.71623: variable 'ansible_python' from source: facts 41445 1727204201.71642: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204201.71699: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204201.71755: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204201.71842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.71858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.71875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.71900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.71915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.71947: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.71967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.71985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.72011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.72022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.72140: variable 'network_connections' from source: task vars 41445 1727204201.72143: variable 'interface' from source: set_fact 41445 1727204201.72195: variable 'interface' from source: set_fact 41445 1727204201.72203: variable 'interface' from source: set_fact 41445 1727204201.72273: variable 'interface' from source: set_fact 41445 1727204201.72329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204201.72348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204201.72373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.72395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204201.72431: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204201.72608: variable 'network_connections' from source: task vars 41445 1727204201.72615: variable 'interface' from source: set_fact 41445 1727204201.72683: variable 'interface' from source: set_fact 41445 1727204201.72689: variable 'interface' from source: set_fact 41445 1727204201.72758: variable 'interface' from source: set_fact 41445 1727204201.72804: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204201.72857: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204201.73050: variable 'network_connections' from source: task vars 41445 1727204201.73054: variable 'interface' from source: set_fact 41445 1727204201.73180: variable 'interface' from source: set_fact 41445 1727204201.73183: variable 'interface' from source: set_fact 41445 1727204201.73185: variable 'interface' from source: set_fact 41445 1727204201.73200: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204201.73280: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204201.73579: variable 'network_connections' from source: task vars 41445 1727204201.73591: variable 'interface' from source: set_fact 41445 1727204201.73654: variable 'interface' from source: set_fact 41445 1727204201.73667: variable 'interface' from source: set_fact 41445 1727204201.73753: variable 'interface' from source: set_fact 41445 1727204201.73829: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 
1727204201.73899: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204201.73914: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204201.73982: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204201.74202: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204201.74706: variable 'network_connections' from source: task vars 41445 1727204201.74782: variable 'interface' from source: set_fact 41445 1727204201.74785: variable 'interface' from source: set_fact 41445 1727204201.74789: variable 'interface' from source: set_fact 41445 1727204201.74850: variable 'interface' from source: set_fact 41445 1727204201.74870: variable 'ansible_distribution' from source: facts 41445 1727204201.74882: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.74893: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.74920: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204201.75082: variable 'ansible_distribution' from source: facts 41445 1727204201.75092: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.75103: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.75353: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204201.75433: variable 'ansible_distribution' from source: facts 41445 1727204201.75444: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.75457: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.75504: variable 'network_provider' from source: set_fact 41445 1727204201.75527: variable 'ansible_facts' from source: unknown 41445 1727204201.76236: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 41445 1727204201.76245: when evaluation is False, skipping this task 41445 1727204201.76251: _execute() done 41445 1727204201.76257: dumping result to json 41445 1727204201.76262: done dumping result, returning 41445 1727204201.76274: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-bf02-eee4-00000000006f] 41445 1727204201.76287: sending task result for task 028d2410-947f-bf02-eee4-00000000006f skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41445 1727204201.76486: no more pending results, returning what we have 41445 1727204201.76491: results queue empty 41445 1727204201.76492: checking for any_errors_fatal 41445 1727204201.76501: done checking for any_errors_fatal 41445 1727204201.76502: checking for max_fail_percentage 41445 1727204201.76504: done checking for max_fail_percentage 41445 1727204201.76505: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.76506: done checking to see if all hosts have failed 41445 1727204201.76506: getting the remaining hosts for this loop 41445 1727204201.76508: done getting the remaining hosts for this loop 41445 1727204201.76515: getting the next task for host managed-node3 41445 1727204201.76523: done getting next task for host managed-node3 41445 1727204201.76527: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204201.76535: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204201.76556: getting variables 41445 1727204201.76558: in VariableManager get_vars() 41445 1727204201.76658: Calling all_inventory to load vars for managed-node3 41445 1727204201.76666: Calling groups_inventory to load vars for managed-node3 41445 1727204201.76668: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.76682: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.76686: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.76688: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.77289: done sending task result for task 028d2410-947f-bf02-eee4-00000000006f 41445 1727204201.77293: WORKER PROCESS EXITING 41445 1727204201.78052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.78942: done with get_vars() 41445 1727204201.78958: done getting variables 41445 1727204201.79004: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.113) 0:00:20.578 ***** 41445 1727204201.79029: 
entering _queue_task() for managed-node3/package 41445 1727204201.79260: worker is 1 (out of 1 available) 41445 1727204201.79274: exiting _queue_task() for managed-node3/package 41445 1727204201.79286: done queuing things up, now waiting for results queue to drain 41445 1727204201.79287: waiting for pending results... 41445 1727204201.79470: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204201.79565: in run() - task 028d2410-947f-bf02-eee4-000000000070 41445 1727204201.79579: variable 'ansible_search_path' from source: unknown 41445 1727204201.79584: variable 'ansible_search_path' from source: unknown 41445 1727204201.79618: calling self._execute() 41445 1727204201.79691: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.79698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.79707: variable 'omit' from source: magic vars 41445 1727204201.79993: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.80002: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.80085: variable 'network_state' from source: role '' defaults 41445 1727204201.80094: Evaluated conditional (network_state != {}): False 41445 1727204201.80097: when evaluation is False, skipping this task 41445 1727204201.80100: _execute() done 41445 1727204201.80103: dumping result to json 41445 1727204201.80105: done dumping result, returning 41445 1727204201.80114: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-bf02-eee4-000000000070] 41445 1727204201.80121: sending task result for task 028d2410-947f-bf02-eee4-000000000070 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 41445 1727204201.80256: no more pending results, returning what we have 41445 1727204201.80260: results queue empty 41445 1727204201.80261: checking for any_errors_fatal 41445 1727204201.80267: done checking for any_errors_fatal 41445 1727204201.80267: checking for max_fail_percentage 41445 1727204201.80269: done checking for max_fail_percentage 41445 1727204201.80270: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.80271: done checking to see if all hosts have failed 41445 1727204201.80271: getting the remaining hosts for this loop 41445 1727204201.80272: done getting the remaining hosts for this loop 41445 1727204201.80277: getting the next task for host managed-node3 41445 1727204201.80284: done getting next task for host managed-node3 41445 1727204201.80288: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204201.80291: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204201.80310: getting variables 41445 1727204201.80312: in VariableManager get_vars() 41445 1727204201.80347: Calling all_inventory to load vars for managed-node3 41445 1727204201.80349: Calling groups_inventory to load vars for managed-node3 41445 1727204201.80351: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.80360: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.80362: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.80365: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.81230: done sending task result for task 028d2410-947f-bf02-eee4-000000000070 41445 1727204201.81234: WORKER PROCESS EXITING 41445 1727204201.81244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.82114: done with get_vars() 41445 1727204201.82129: done getting variables 41445 1727204201.82169: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.031) 0:00:20.609 ***** 41445 1727204201.82192: entering _queue_task() for managed-node3/package 41445 1727204201.82408: worker is 1 (out of 1 available) 41445 1727204201.82421: exiting _queue_task() for managed-node3/package 41445 1727204201.82432: done queuing things up, now waiting for results queue to drain 41445 1727204201.82434: waiting for pending results... 
41445 1727204201.82612: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204201.82700: in run() - task 028d2410-947f-bf02-eee4-000000000071 41445 1727204201.82712: variable 'ansible_search_path' from source: unknown 41445 1727204201.82717: variable 'ansible_search_path' from source: unknown 41445 1727204201.82747: calling self._execute() 41445 1727204201.82821: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.82825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.82833: variable 'omit' from source: magic vars 41445 1727204201.83107: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.83119: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.83203: variable 'network_state' from source: role '' defaults 41445 1727204201.83213: Evaluated conditional (network_state != {}): False 41445 1727204201.83217: when evaluation is False, skipping this task 41445 1727204201.83219: _execute() done 41445 1727204201.83222: dumping result to json 41445 1727204201.83224: done dumping result, returning 41445 1727204201.83230: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-bf02-eee4-000000000071] 41445 1727204201.83236: sending task result for task 028d2410-947f-bf02-eee4-000000000071 41445 1727204201.83331: done sending task result for task 028d2410-947f-bf02-eee4-000000000071 41445 1727204201.83334: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204201.83380: no more pending results, returning what we have 41445 1727204201.83383: results queue empty 41445 1727204201.83384: checking for 
any_errors_fatal 41445 1727204201.83393: done checking for any_errors_fatal 41445 1727204201.83394: checking for max_fail_percentage 41445 1727204201.83395: done checking for max_fail_percentage 41445 1727204201.83396: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.83397: done checking to see if all hosts have failed 41445 1727204201.83398: getting the remaining hosts for this loop 41445 1727204201.83399: done getting the remaining hosts for this loop 41445 1727204201.83402: getting the next task for host managed-node3 41445 1727204201.83408: done getting next task for host managed-node3 41445 1727204201.83411: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204201.83413: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204201.83429: getting variables 41445 1727204201.83431: in VariableManager get_vars() 41445 1727204201.83463: Calling all_inventory to load vars for managed-node3 41445 1727204201.83465: Calling groups_inventory to load vars for managed-node3 41445 1727204201.83467: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.83474: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.83479: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.83482: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.84217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.85101: done with get_vars() 41445 1727204201.85117: done getting variables 41445 1727204201.85161: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.029) 0:00:20.639 ***** 41445 1727204201.85185: entering _queue_task() for managed-node3/service 41445 1727204201.85390: worker is 1 (out of 1 available) 41445 1727204201.85404: exiting _queue_task() for managed-node3/service 41445 1727204201.85414: done queuing things up, now waiting for results queue to drain 41445 1727204201.85416: waiting for pending results... 
41445 1727204201.85589: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204201.85679: in run() - task 028d2410-947f-bf02-eee4-000000000072 41445 1727204201.85691: variable 'ansible_search_path' from source: unknown 41445 1727204201.85694: variable 'ansible_search_path' from source: unknown 41445 1727204201.85723: calling self._execute() 41445 1727204201.85796: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.85800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.85808: variable 'omit' from source: magic vars 41445 1727204201.86086: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.86097: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.86184: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204201.86318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204201.88019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204201.88073: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204201.88101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204201.88128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204201.88148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204201.88215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41445 1727204201.88235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.88252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.88285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.88296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.88330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.88347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.88364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.88394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.88405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.88433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.88449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.88465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.88495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.88506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.88614: variable 'network_connections' from source: task vars 41445 1727204201.88622: variable 'interface' from source: set_fact 41445 1727204201.88672: variable 'interface' from source: set_fact 41445 1727204201.88681: variable 'interface' from source: set_fact 41445 1727204201.88728: variable 'interface' from source: set_fact 41445 1727204201.88779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204201.88885: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204201.88915: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204201.88946: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204201.88967: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204201.88998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204201.89015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204201.89035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.89051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204201.89098: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204201.89245: variable 'network_connections' from source: task vars 41445 1727204201.89248: variable 'interface' from source: set_fact 41445 1727204201.89293: variable 'interface' from source: set_fact 41445 1727204201.89299: variable 'interface' from source: set_fact 41445 1727204201.89341: variable 'interface' from source: set_fact 41445 1727204201.89367: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204201.89374: when evaluation is False, skipping this task 41445 1727204201.89379: _execute() done 41445 1727204201.89382: dumping result to json 41445 1727204201.89384: done dumping result, returning 41445 1727204201.89391: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-000000000072] 41445 1727204201.89402: sending task result for task 028d2410-947f-bf02-eee4-000000000072 41445 1727204201.89484: done sending task result for task 028d2410-947f-bf02-eee4-000000000072 41445 1727204201.89487: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204201.89533: no more pending results, returning what we have 41445 1727204201.89536: results queue empty 41445 1727204201.89536: checking for any_errors_fatal 41445 1727204201.89544: done checking for any_errors_fatal 41445 1727204201.89544: checking for max_fail_percentage 41445 1727204201.89546: done checking for max_fail_percentage 41445 1727204201.89547: checking to see if all hosts have failed and the running result is not ok 41445 1727204201.89547: done checking to see if all hosts have failed 41445 1727204201.89548: getting the remaining hosts for this loop 41445 1727204201.89549: done getting the remaining hosts for this loop 41445 1727204201.89553: getting the next task for host managed-node3 41445 1727204201.89560: done getting next task for host managed-node3 41445 1727204201.89563: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204201.89565: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41445 1727204201.89586: getting variables 41445 1727204201.89587: in VariableManager get_vars() 41445 1727204201.89633: Calling all_inventory to load vars for managed-node3 41445 1727204201.89636: Calling groups_inventory to load vars for managed-node3 41445 1727204201.89638: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204201.89648: Calling all_plugins_play to load vars for managed-node3 41445 1727204201.89650: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204201.89653: Calling groups_plugins_play to load vars for managed-node3 41445 1727204201.90547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204201.91444: done with get_vars() 41445 1727204201.91461: done getting variables 41445 1727204201.91506: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:41 -0400 (0:00:00.063) 0:00:20.703 ***** 41445 1727204201.91532: entering _queue_task() for managed-node3/service 41445 1727204201.91778: worker is 1 (out of 1 available) 41445 1727204201.91793: exiting _queue_task() for managed-node3/service 41445 1727204201.91804: done queuing things up, now waiting for results queue to drain 41445 1727204201.91805: waiting for pending results... 
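An annotation on the trace above: the "Restart NetworkManager" task was skipped because its `when:` conditional (`__network_wireless_connections_defined or __network_team_connections_defined`) evaluated to False, and the worker returned the `skipping:` result dict shown with `false_condition` and `skip_reason` fields. A minimal, hypothetical Python sketch of that skip decision follows; it is not Ansible's actual implementation (real Ansible templates the expression through Jinja2), and the helper names are illustrative:

```python
# Hypothetical sketch of the skip decision seen in the trace above.
# Real Ansible templates the `when:` expression through Jinja2; here the
# role-default booleans are simply looked up by name.

def evaluate_when(condition, variables):
    """Return True if any OR-ed variable in `condition` is truthy."""
    names = [name.strip() for name in condition.split(" or ")]
    return any(variables.get(name, False) for name in names)

def run_or_skip(condition, variables):
    """Mimic the `skipping:` result dict emitted in the log."""
    if evaluate_when(condition, variables):
        return {"changed": True}
    return {
        "changed": False,
        "false_condition": condition,
        "skip_reason": "Conditional result was False",
    }

result = run_or_skip(
    "__network_wireless_connections_defined or __network_team_connections_defined",
    {
        "__network_wireless_connections_defined": False,
        "__network_team_connections_defined": False,
    },
)
```

With both role defaults False, `result` matches the shape of the skipped-task payload logged above for managed-node3.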
41445 1727204201.91988: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204201.92077: in run() - task 028d2410-947f-bf02-eee4-000000000073 41445 1727204201.92089: variable 'ansible_search_path' from source: unknown 41445 1727204201.92092: variable 'ansible_search_path' from source: unknown 41445 1727204201.92120: calling self._execute() 41445 1727204201.92198: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.92202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.92213: variable 'omit' from source: magic vars 41445 1727204201.92493: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.92502: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204201.92614: variable 'network_provider' from source: set_fact 41445 1727204201.92619: variable 'network_state' from source: role '' defaults 41445 1727204201.92626: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41445 1727204201.92632: variable 'omit' from source: magic vars 41445 1727204201.92672: variable 'omit' from source: magic vars 41445 1727204201.92698: variable 'network_service_name' from source: role '' defaults 41445 1727204201.92751: variable 'network_service_name' from source: role '' defaults 41445 1727204201.92826: variable '__network_provider_setup' from source: role '' defaults 41445 1727204201.92829: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204201.92874: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204201.92884: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204201.92932: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204201.93072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41445 1727204201.94493: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204201.94543: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204201.94574: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204201.94603: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204201.94625: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204201.94690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.94711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.94731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.94757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.94775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.94807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204201.94827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.94843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.94873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.94883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.95023: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204201.95099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.95119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.95136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.95160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.95170: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.95236: variable 'ansible_python' from source: facts 41445 1727204201.95253: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204201.95310: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204201.95363: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204201.95450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.95467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.95485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.95512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.95526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.95559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204201.95579: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204201.95596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.95628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204201.95635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204201.95727: variable 'network_connections' from source: task vars 41445 1727204201.95733: variable 'interface' from source: set_fact 41445 1727204201.95788: variable 'interface' from source: set_fact 41445 1727204201.95797: variable 'interface' from source: set_fact 41445 1727204201.95853: variable 'interface' from source: set_fact 41445 1727204201.95942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204201.96073: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204201.96109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204201.96142: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204201.96172: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204201.96218: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204201.96238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204201.96260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204201.96286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204201.96325: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204201.96501: variable 'network_connections' from source: task vars 41445 1727204201.96507: variable 'interface' from source: set_fact 41445 1727204201.96561: variable 'interface' from source: set_fact 41445 1727204201.96569: variable 'interface' from source: set_fact 41445 1727204201.96628: variable 'interface' from source: set_fact 41445 1727204201.96679: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204201.96733: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204201.96920: variable 'network_connections' from source: task vars 41445 1727204201.96923: variable 'interface' from source: set_fact 41445 1727204201.96974: variable 'interface' from source: set_fact 41445 1727204201.96981: variable 'interface' from source: set_fact 41445 1727204201.97030: variable 'interface' from source: set_fact 41445 1727204201.97051: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204201.97111: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204201.97291: variable 
'network_connections' from source: task vars 41445 1727204201.97294: variable 'interface' from source: set_fact 41445 1727204201.97343: variable 'interface' from source: set_fact 41445 1727204201.97349: variable 'interface' from source: set_fact 41445 1727204201.97400: variable 'interface' from source: set_fact 41445 1727204201.97444: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204201.97487: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204201.97499: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204201.97538: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204201.97668: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204201.98112: variable 'network_connections' from source: task vars 41445 1727204201.98116: variable 'interface' from source: set_fact 41445 1727204201.98156: variable 'interface' from source: set_fact 41445 1727204201.98165: variable 'interface' from source: set_fact 41445 1727204201.98207: variable 'interface' from source: set_fact 41445 1727204201.98221: variable 'ansible_distribution' from source: facts 41445 1727204201.98224: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.98229: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.98246: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204201.98378: variable 'ansible_distribution' from source: facts 41445 1727204201.98381: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.98387: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.98398: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204201.98514: variable 'ansible_distribution' from source: 
facts 41445 1727204201.98518: variable '__network_rh_distros' from source: role '' defaults 41445 1727204201.98520: variable 'ansible_distribution_major_version' from source: facts 41445 1727204201.98544: variable 'network_provider' from source: set_fact 41445 1727204201.98560: variable 'omit' from source: magic vars 41445 1727204201.98581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204201.98606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204201.98620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204201.98633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204201.98642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204201.98664: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204201.98666: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.98669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.98740: Set connection var ansible_shell_executable to /bin/sh 41445 1727204201.98743: Set connection var ansible_shell_type to sh 41445 1727204201.98746: Set connection var ansible_pipelining to False 41445 1727204201.98753: Set connection var ansible_timeout to 10 41445 1727204201.98755: Set connection var ansible_connection to ssh 41445 1727204201.98761: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204201.98783: variable 'ansible_shell_executable' from source: unknown 41445 1727204201.98786: variable 'ansible_connection' from source: unknown 41445 1727204201.98789: variable 'ansible_module_compression' from source: unknown 41445 1727204201.98791: 
variable 'ansible_shell_type' from source: unknown 41445 1727204201.98793: variable 'ansible_shell_executable' from source: unknown 41445 1727204201.98795: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204201.98804: variable 'ansible_pipelining' from source: unknown 41445 1727204201.98806: variable 'ansible_timeout' from source: unknown 41445 1727204201.98808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204201.98873: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204201.98885: variable 'omit' from source: magic vars 41445 1727204201.98889: starting attempt loop 41445 1727204201.98892: running the handler 41445 1727204201.98948: variable 'ansible_facts' from source: unknown 41445 1727204201.99384: _low_level_execute_command(): starting 41445 1727204201.99389: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204201.99874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204201.99880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204201.99883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41445 1727204201.99885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204201.99939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204201.99942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204201.99945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204201.99994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.01651: stdout chunk (state=3): >>>/root <<< 41445 1727204202.01754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.01784: stderr chunk (state=3): >>><<< 41445 1727204202.01788: stdout chunk (state=3): >>><<< 41445 1727204202.01805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.01818: _low_level_execute_command(): starting 41445 1727204202.01823: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519 `" && echo ansible-tmp-1727204202.0180535-42898-61837860941519="` echo /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519 `" ) && sleep 0' 41445 1727204202.02255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204202.02258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204202.02261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.02263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.02265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.02323: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.02328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204202.02330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.02359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.04208: stdout chunk (state=3): >>>ansible-tmp-1727204202.0180535-42898-61837860941519=/root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519 <<< 41445 1727204202.04318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.04343: stderr chunk (state=3): >>><<< 41445 1727204202.04346: stdout chunk (state=3): >>><<< 41445 1727204202.04358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204202.0180535-42898-61837860941519=/root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.04386: variable 'ansible_module_compression' from source: unknown 41445 1727204202.04429: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41445 1727204202.04487: variable 'ansible_facts' from source: unknown 41445 1727204202.04594: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py 41445 1727204202.04695: Sending initial data 41445 1727204202.04699: Sent initial data (155 bytes) 41445 1727204202.05144: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.05147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.05150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204202.05152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204202.05154: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.05207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204202.05213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.05242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.06799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204202.06809: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204202.06835: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204202.06870: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpn6rryaj0 /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py <<< 41445 1727204202.06878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py" <<< 41445 1727204202.06903: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpn6rryaj0" to remote "/root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py" <<< 41445 1727204202.06910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py" <<< 41445 1727204202.07961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.08002: stderr chunk (state=3): >>><<< 41445 1727204202.08005: stdout chunk (state=3): >>><<< 41445 1727204202.08028: done transferring module to remote 41445 1727204202.08037: _low_level_execute_command(): starting 41445 1727204202.08041: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/ /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py && sleep 0' 41445 1727204202.08477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.08481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 
1727204202.08483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204202.08485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204202.08487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.08532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204202.08535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.08572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.10306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.10329: stderr chunk (state=3): >>><<< 41445 1727204202.10332: stdout chunk (state=3): >>><<< 41445 1727204202.10343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.10346: _low_level_execute_command(): starting 41445 1727204202.10350: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/AnsiballZ_systemd.py && sleep 0' 41445 1727204202.10763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.10766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204202.10768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204202.10774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204202.10778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.10823: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.10831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.10863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.39461: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10510336", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3290546176", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1827545000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", 
"MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 41445 1727204202.39505: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": 
"3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-broker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41445 1727204202.41307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.41323: stderr chunk (state=3): >>>Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204202.41348: stderr chunk (state=3): >>><<< 41445 1727204202.41351: stdout chunk (state=3): >>><<< 41445 1727204202.41368: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10510336", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3290546176", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1827545000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-broker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204202.41488: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204202.41493: _low_level_execute_command(): starting 41445 1727204202.41499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204202.0180535-42898-61837860941519/ > /dev/null 2>&1 && sleep 0' 41445 1727204202.42187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.42192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.42195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204202.42198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.42225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.44004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.44029: stderr chunk (state=3): >>><<< 41445 1727204202.44032: stdout chunk (state=3): >>><<< 41445 1727204202.44048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.44054: handler run complete 41445 
1727204202.44092: attempt loop complete, returning result 41445 1727204202.44095: _execute() done 41445 1727204202.44097: dumping result to json 41445 1727204202.44111: done dumping result, returning 41445 1727204202.44118: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-bf02-eee4-000000000073] 41445 1727204202.44124: sending task result for task 028d2410-947f-bf02-eee4-000000000073 41445 1727204202.44350: done sending task result for task 028d2410-947f-bf02-eee4-000000000073 41445 1727204202.44353: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204202.44413: no more pending results, returning what we have 41445 1727204202.44416: results queue empty 41445 1727204202.44417: checking for any_errors_fatal 41445 1727204202.44423: done checking for any_errors_fatal 41445 1727204202.44424: checking for max_fail_percentage 41445 1727204202.44425: done checking for max_fail_percentage 41445 1727204202.44426: checking to see if all hosts have failed and the running result is not ok 41445 1727204202.44427: done checking to see if all hosts have failed 41445 1727204202.44427: getting the remaining hosts for this loop 41445 1727204202.44429: done getting the remaining hosts for this loop 41445 1727204202.44432: getting the next task for host managed-node3 41445 1727204202.44438: done getting next task for host managed-node3 41445 1727204202.44441: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204202.44443: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204202.44453: getting variables 41445 1727204202.44455: in VariableManager get_vars() 41445 1727204202.44494: Calling all_inventory to load vars for managed-node3 41445 1727204202.44496: Calling groups_inventory to load vars for managed-node3 41445 1727204202.44498: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204202.44507: Calling all_plugins_play to load vars for managed-node3 41445 1727204202.44512: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204202.44515: Calling groups_plugins_play to load vars for managed-node3 41445 1727204202.45394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204202.46286: done with get_vars() 41445 1727204202.46303: done getting variables 41445 1727204202.46349: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:42 -0400 (0:00:00.548) 0:00:21.251 ***** 41445 1727204202.46371: entering _queue_task() for managed-node3/service 41445 1727204202.46616: worker is 1 (out of 1 available) 41445 1727204202.46629: exiting _queue_task() for managed-node3/service 
41445 1727204202.46639: done queuing things up, now waiting for results queue to drain 41445 1727204202.46641: waiting for pending results... 41445 1727204202.46824: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204202.46981: in run() - task 028d2410-947f-bf02-eee4-000000000074 41445 1727204202.46985: variable 'ansible_search_path' from source: unknown 41445 1727204202.46988: variable 'ansible_search_path' from source: unknown 41445 1727204202.46991: calling self._execute() 41445 1727204202.47037: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204202.47041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204202.47049: variable 'omit' from source: magic vars 41445 1727204202.47336: variable 'ansible_distribution_major_version' from source: facts 41445 1727204202.47345: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204202.47426: variable 'network_provider' from source: set_fact 41445 1727204202.47431: Evaluated conditional (network_provider == "nm"): True 41445 1727204202.47493: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204202.47560: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204202.47678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204202.49093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204202.49139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204202.49166: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204202.49194: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204202.49218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204202.49291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204202.49311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204202.49331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204202.49356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204202.49369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204202.49405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204202.49423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204202.49440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 41445 1727204202.49464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204202.49481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204202.49513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204202.49528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204202.49544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204202.49568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204202.49586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204202.49676: variable 'network_connections' from source: task vars 41445 1727204202.49687: variable 'interface' from source: set_fact 41445 1727204202.49738: variable 'interface' from source: set_fact 41445 1727204202.49746: variable 'interface' from source: set_fact 41445 1727204202.49789: variable 'interface' from source: set_fact 41445 1727204202.49843: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204202.49953: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204202.49979: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204202.50002: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204202.50027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204202.50082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204202.50085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204202.50088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204202.50104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204202.50144: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204202.50306: variable 'network_connections' from source: task vars 41445 1727204202.50312: variable 'interface' from source: set_fact 41445 1727204202.50353: variable 'interface' from source: set_fact 41445 1727204202.50363: variable 'interface' from source: set_fact 41445 1727204202.50406: variable 'interface' from source: set_fact 41445 1727204202.50439: Evaluated conditional 
(__network_wpa_supplicant_required): False 41445 1727204202.50443: when evaluation is False, skipping this task 41445 1727204202.50445: _execute() done 41445 1727204202.50457: dumping result to json 41445 1727204202.50460: done dumping result, returning 41445 1727204202.50463: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-bf02-eee4-000000000074] 41445 1727204202.50465: sending task result for task 028d2410-947f-bf02-eee4-000000000074 41445 1727204202.50547: done sending task result for task 028d2410-947f-bf02-eee4-000000000074 41445 1727204202.50550: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41445 1727204202.50621: no more pending results, returning what we have 41445 1727204202.50625: results queue empty 41445 1727204202.50626: checking for any_errors_fatal 41445 1727204202.50644: done checking for any_errors_fatal 41445 1727204202.50645: checking for max_fail_percentage 41445 1727204202.50646: done checking for max_fail_percentage 41445 1727204202.50647: checking to see if all hosts have failed and the running result is not ok 41445 1727204202.50648: done checking to see if all hosts have failed 41445 1727204202.50648: getting the remaining hosts for this loop 41445 1727204202.50650: done getting the remaining hosts for this loop 41445 1727204202.50653: getting the next task for host managed-node3 41445 1727204202.50659: done getting next task for host managed-node3 41445 1727204202.50662: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204202.50665: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204202.50685: getting variables 41445 1727204202.50687: in VariableManager get_vars() 41445 1727204202.50724: Calling all_inventory to load vars for managed-node3 41445 1727204202.50727: Calling groups_inventory to load vars for managed-node3 41445 1727204202.50729: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204202.50738: Calling all_plugins_play to load vars for managed-node3 41445 1727204202.50740: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204202.50743: Calling groups_plugins_play to load vars for managed-node3 41445 1727204202.51514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204202.52383: done with get_vars() 41445 1727204202.52399: done getting variables 41445 1727204202.52442: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:42 -0400 (0:00:00.060) 0:00:21.312 ***** 41445 1727204202.52463: entering _queue_task() for managed-node3/service 41445 1727204202.52691: worker is 1 (out of 1 available) 41445 1727204202.52706: exiting _queue_task() for managed-node3/service 
41445 1727204202.52717: done queuing things up, now waiting for results queue to drain 41445 1727204202.52718: waiting for pending results... 41445 1727204202.52898: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204202.52986: in run() - task 028d2410-947f-bf02-eee4-000000000075 41445 1727204202.52997: variable 'ansible_search_path' from source: unknown 41445 1727204202.53001: variable 'ansible_search_path' from source: unknown 41445 1727204202.53030: calling self._execute() 41445 1727204202.53104: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204202.53108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204202.53119: variable 'omit' from source: magic vars 41445 1727204202.53391: variable 'ansible_distribution_major_version' from source: facts 41445 1727204202.53402: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204202.53480: variable 'network_provider' from source: set_fact 41445 1727204202.53492: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204202.53495: when evaluation is False, skipping this task 41445 1727204202.53498: _execute() done 41445 1727204202.53500: dumping result to json 41445 1727204202.53504: done dumping result, returning 41445 1727204202.53506: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-bf02-eee4-000000000075] 41445 1727204202.53509: sending task result for task 028d2410-947f-bf02-eee4-000000000075 41445 1727204202.53595: done sending task result for task 028d2410-947f-bf02-eee4-000000000075 41445 1727204202.53598: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204202.53640: no more pending results, returning what we have 
41445 1727204202.53644: results queue empty 41445 1727204202.53645: checking for any_errors_fatal 41445 1727204202.53656: done checking for any_errors_fatal 41445 1727204202.53657: checking for max_fail_percentage 41445 1727204202.53659: done checking for max_fail_percentage 41445 1727204202.53660: checking to see if all hosts have failed and the running result is not ok 41445 1727204202.53661: done checking to see if all hosts have failed 41445 1727204202.53661: getting the remaining hosts for this loop 41445 1727204202.53663: done getting the remaining hosts for this loop 41445 1727204202.53666: getting the next task for host managed-node3 41445 1727204202.53672: done getting next task for host managed-node3 41445 1727204202.53678: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204202.53680: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204202.53699: getting variables 41445 1727204202.53700: in VariableManager get_vars() 41445 1727204202.53735: Calling all_inventory to load vars for managed-node3 41445 1727204202.53738: Calling groups_inventory to load vars for managed-node3 41445 1727204202.53740: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204202.53748: Calling all_plugins_play to load vars for managed-node3 41445 1727204202.53751: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204202.53753: Calling groups_plugins_play to load vars for managed-node3 41445 1727204202.54620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204202.55486: done with get_vars() 41445 1727204202.55501: done getting variables 41445 1727204202.55544: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:42 -0400 (0:00:00.031) 0:00:21.343 ***** 41445 1727204202.55566: entering _queue_task() for managed-node3/copy 41445 1727204202.55781: worker is 1 (out of 1 available) 41445 1727204202.55796: exiting _queue_task() for managed-node3/copy 41445 1727204202.55807: done queuing things up, now waiting for results queue to drain 41445 1727204202.55809: waiting for pending results... 
41445 1727204202.55981: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204202.56068: in run() - task 028d2410-947f-bf02-eee4-000000000076 41445 1727204202.56081: variable 'ansible_search_path' from source: unknown 41445 1727204202.56084: variable 'ansible_search_path' from source: unknown 41445 1727204202.56113: calling self._execute() 41445 1727204202.56185: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204202.56190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204202.56198: variable 'omit' from source: magic vars 41445 1727204202.56468: variable 'ansible_distribution_major_version' from source: facts 41445 1727204202.56480: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204202.56557: variable 'network_provider' from source: set_fact 41445 1727204202.56562: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204202.56564: when evaluation is False, skipping this task 41445 1727204202.56567: _execute() done 41445 1727204202.56570: dumping result to json 41445 1727204202.56573: done dumping result, returning 41445 1727204202.56586: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-bf02-eee4-000000000076] 41445 1727204202.56589: sending task result for task 028d2410-947f-bf02-eee4-000000000076 41445 1727204202.56678: done sending task result for task 028d2410-947f-bf02-eee4-000000000076 41445 1727204202.56681: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41445 1727204202.56733: no more pending results, returning what we have 41445 1727204202.56737: results queue empty 41445 1727204202.56738: checking for 
any_errors_fatal 41445 1727204202.56743: done checking for any_errors_fatal 41445 1727204202.56744: checking for max_fail_percentage 41445 1727204202.56745: done checking for max_fail_percentage 41445 1727204202.56746: checking to see if all hosts have failed and the running result is not ok 41445 1727204202.56747: done checking to see if all hosts have failed 41445 1727204202.56748: getting the remaining hosts for this loop 41445 1727204202.56749: done getting the remaining hosts for this loop 41445 1727204202.56752: getting the next task for host managed-node3 41445 1727204202.56758: done getting next task for host managed-node3 41445 1727204202.56761: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204202.56764: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204202.56782: getting variables 41445 1727204202.56784: in VariableManager get_vars() 41445 1727204202.56819: Calling all_inventory to load vars for managed-node3 41445 1727204202.56821: Calling groups_inventory to load vars for managed-node3 41445 1727204202.56823: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204202.56831: Calling all_plugins_play to load vars for managed-node3 41445 1727204202.56833: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204202.56836: Calling groups_plugins_play to load vars for managed-node3 41445 1727204202.57583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204202.58551: done with get_vars() 41445 1727204202.58565: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:42 -0400 (0:00:00.030) 0:00:21.374 ***** 41445 1727204202.58629: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204202.58841: worker is 1 (out of 1 available) 41445 1727204202.58855: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204202.58867: done queuing things up, now waiting for results queue to drain 41445 1727204202.58868: waiting for pending results... 
41445 1727204202.59045: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204202.59132: in run() - task 028d2410-947f-bf02-eee4-000000000077 41445 1727204202.59143: variable 'ansible_search_path' from source: unknown 41445 1727204202.59147: variable 'ansible_search_path' from source: unknown 41445 1727204202.59173: calling self._execute() 41445 1727204202.59247: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204202.59251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204202.59260: variable 'omit' from source: magic vars 41445 1727204202.59528: variable 'ansible_distribution_major_version' from source: facts 41445 1727204202.59635: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204202.59641: variable 'omit' from source: magic vars 41445 1727204202.59643: variable 'omit' from source: magic vars 41445 1727204202.59690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204202.61118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204202.61164: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204202.61193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204202.61220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204202.61241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204202.61299: variable 'network_provider' from source: set_fact 41445 1727204202.61391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204202.61424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204202.61442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204202.61468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204202.61482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204202.61536: variable 'omit' from source: magic vars 41445 1727204202.61616: variable 'omit' from source: magic vars 41445 1727204202.61686: variable 'network_connections' from source: task vars 41445 1727204202.61698: variable 'interface' from source: set_fact 41445 1727204202.61745: variable 'interface' from source: set_fact 41445 1727204202.61751: variable 'interface' from source: set_fact 41445 1727204202.61796: variable 'interface' from source: set_fact 41445 1727204202.61934: variable 'omit' from source: magic vars 41445 1727204202.61942: variable '__lsr_ansible_managed' from source: task vars 41445 1727204202.61984: variable '__lsr_ansible_managed' from source: task vars 41445 1727204202.62169: Loaded config def from plugin (lookup/template) 41445 1727204202.62172: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41445 1727204202.62195: File lookup term: get_ansible_managed.j2 41445 
1727204202.62199: variable 'ansible_search_path' from source: unknown 41445 1727204202.62202: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41445 1727204202.62213: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41445 1727204202.62229: variable 'ansible_search_path' from source: unknown 41445 1727204202.65602: variable 'ansible_managed' from source: unknown 41445 1727204202.65672: variable 'omit' from source: magic vars 41445 1727204202.65696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204202.65721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204202.65734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204202.65747: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204202.65755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204202.65778: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204202.65782: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204202.65784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204202.65850: Set connection var ansible_shell_executable to /bin/sh 41445 1727204202.65853: Set connection var ansible_shell_type to sh 41445 1727204202.65856: Set connection var ansible_pipelining to False 41445 1727204202.65863: Set connection var ansible_timeout to 10 41445 1727204202.65866: Set connection var ansible_connection to ssh 41445 1727204202.65872: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204202.65890: variable 'ansible_shell_executable' from source: unknown 41445 1727204202.65893: variable 'ansible_connection' from source: unknown 41445 1727204202.65895: variable 'ansible_module_compression' from source: unknown 41445 1727204202.65898: variable 'ansible_shell_type' from source: unknown 41445 1727204202.65900: variable 'ansible_shell_executable' from source: unknown 41445 1727204202.65903: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204202.65906: variable 'ansible_pipelining' from source: unknown 41445 1727204202.65909: variable 'ansible_timeout' from source: unknown 41445 1727204202.65921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204202.66005: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204202.66017: variable 'omit' from source: magic vars 41445 1727204202.66024: starting attempt loop 41445 1727204202.66027: running the handler 41445 1727204202.66038: _low_level_execute_command(): starting 41445 1727204202.66045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204202.66550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.66554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.66557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204202.66559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204202.66561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.66613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.66616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204202.66618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.66666: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.68318: stdout chunk (state=3): >>>/root <<< 41445 1727204202.68420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.68450: stderr chunk (state=3): >>><<< 41445 1727204202.68453: stdout chunk (state=3): >>><<< 41445 1727204202.68471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.68486: _low_level_execute_command(): starting 41445 1727204202.68492: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676 `" && echo ansible-tmp-1727204202.6847224-42925-146151869730676="` echo 
/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676 `" ) && sleep 0' 41445 1727204202.68933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.68937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204202.68939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.68941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204202.68943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204202.68945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.68993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.68996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.69046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.70873: stdout chunk (state=3): >>>ansible-tmp-1727204202.6847224-42925-146151869730676=/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676 <<< 41445 1727204202.70978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 
1727204202.71007: stderr chunk (state=3): >>><<< 41445 1727204202.71010: stdout chunk (state=3): >>><<< 41445 1727204202.71027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204202.6847224-42925-146151869730676=/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.71062: variable 'ansible_module_compression' from source: unknown 41445 1727204202.71100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41445 1727204202.71131: variable 'ansible_facts' from source: unknown 41445 1727204202.71196: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py 41445 1727204202.71295: Sending initial data 41445 1727204202.71298: Sent initial data (168 bytes) 41445 1727204202.71754: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.71762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204202.71765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.71767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.71769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.71813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.71816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.71859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.73352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204202.73355: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204202.73384: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204202.73415: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp8qbkdept /root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py <<< 41445 1727204202.73418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py" <<< 41445 1727204202.73444: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp8qbkdept" to remote "/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py" <<< 41445 1727204202.73450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py" <<< 41445 1727204202.74127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.74164: stderr chunk (state=3): >>><<< 41445 1727204202.74167: stdout chunk (state=3): >>><<< 41445 1727204202.74184: done transferring module to remote 41445 1727204202.74193: _low_level_execute_command(): starting 41445 
1727204202.74197: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/ /root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py && sleep 0' 41445 1727204202.74624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.74627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204202.74629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.74631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.74633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.74680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.74683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.74725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204202.76416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204202.76439: stderr chunk (state=3): >>><<< 41445 
1727204202.76442: stdout chunk (state=3): >>><<< 41445 1727204202.76455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204202.76458: _low_level_execute_command(): starting 41445 1727204202.76462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/AnsiballZ_network_connections.py && sleep 0' 41445 1727204202.76880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.76884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.76886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204202.76888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204202.76938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204202.76942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204202.76987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.04400: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", 
"prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41445 1727204203.06332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204203.06336: stdout chunk (state=3): >>><<< 41445 1727204203.06338: stderr chunk (state=3): >>><<< 41445 1727204203.06340: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26"], "route": [{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": "custom"}, {"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": "custom"}, {"network": "192.0.2.64", "prefix": 26, "gateway": "198.51.100.8", "metric": 50, "table": "custom", "src": "198.51.100.3"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204203.06343: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26'], 'route': [{'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 'custom'}, {'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 'custom'}, {'network': '192.0.2.64', 'prefix': 26, 'gateway': '198.51.100.8', 'metric': 50, 'table': 'custom', 'src': '198.51.100.3'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204203.06346: _low_level_execute_command(): starting 41445 1727204203.06348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204202.6847224-42925-146151869730676/ > /dev/null 2>&1 && sleep 0' 41445 1727204203.07405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204203.07591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204203.07831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204203.07863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.09785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.09789: stdout chunk (state=3): >>><<< 41445 1727204203.09792: stderr chunk (state=3): >>><<< 41445 1727204203.09794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204203.09796: handler run complete 41445 1727204203.09835: attempt loop complete, returning result 41445 1727204203.09843: _execute() done 41445 1727204203.09848: dumping result to json 41445 1727204203.09858: done dumping result, returning 41445 1727204203.09870: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-bf02-eee4-000000000077] 41445 1727204203.09925: sending task result for task 028d2410-947f-bf02-eee4-000000000077 changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 [004] #0, state:up persistent_state:present, 'ethtest0': up connection 
ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (is-modified) [005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied 41445 1727204203.10464: no more pending results, returning what we have 41445 1727204203.10468: results queue empty 41445 1727204203.10469: checking for any_errors_fatal 41445 1727204203.10481: done checking for any_errors_fatal 41445 1727204203.10482: checking for max_fail_percentage 41445 1727204203.10484: done checking for max_fail_percentage 41445 1727204203.10485: checking to see if all hosts have failed and the running result is not ok 41445 1727204203.10486: done checking to see if all hosts have failed 41445 1727204203.10487: getting the remaining hosts for this loop 41445 1727204203.10488: done getting the remaining hosts for this loop 41445 1727204203.10492: getting the next task for host managed-node3 41445 1727204203.10500: done getting next task for host managed-node3 41445 1727204203.10503: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204203.10506: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204203.10515: done sending task result for task 028d2410-947f-bf02-eee4-000000000077 41445 1727204203.10519: WORKER PROCESS EXITING 41445 1727204203.10527: getting variables 41445 1727204203.10529: in VariableManager get_vars() 41445 1727204203.10571: Calling all_inventory to load vars for managed-node3 41445 1727204203.10575: Calling groups_inventory to load vars for managed-node3 41445 1727204203.10580: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.10595: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.10599: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.10602: Calling groups_plugins_play to load vars for managed-node3 41445 1727204203.13297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204203.17311: done with get_vars() 41445 1727204203.17341: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.588) 0:00:21.962 ***** 41445 1727204203.17437: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204203.18388: worker is 1 (out of 1 available) 41445 1727204203.18397: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204203.18407: done queuing things up, now waiting for results queue to drain 41445 1727204203.18408: waiting for pending results... 
41445 1727204203.18991: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204203.18996: in run() - task 028d2410-947f-bf02-eee4-000000000078 41445 1727204203.19000: variable 'ansible_search_path' from source: unknown 41445 1727204203.19002: variable 'ansible_search_path' from source: unknown 41445 1727204203.19213: calling self._execute() 41445 1727204203.19313: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.19324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.19339: variable 'omit' from source: magic vars 41445 1727204203.20144: variable 'ansible_distribution_major_version' from source: facts 41445 1727204203.20163: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204203.20278: variable 'network_state' from source: role '' defaults 41445 1727204203.20681: Evaluated conditional (network_state != {}): False 41445 1727204203.20684: when evaluation is False, skipping this task 41445 1727204203.20687: _execute() done 41445 1727204203.20690: dumping result to json 41445 1727204203.20692: done dumping result, returning 41445 1727204203.20695: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-bf02-eee4-000000000078] 41445 1727204203.20697: sending task result for task 028d2410-947f-bf02-eee4-000000000078 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204203.20834: no more pending results, returning what we have 41445 1727204203.20839: results queue empty 41445 1727204203.20840: checking for any_errors_fatal 41445 1727204203.20852: done checking for any_errors_fatal 41445 1727204203.20853: checking for max_fail_percentage 41445 1727204203.20856: done checking for max_fail_percentage 41445 1727204203.20857: 
checking to see if all hosts have failed and the running result is not ok 41445 1727204203.20858: done checking to see if all hosts have failed 41445 1727204203.20858: getting the remaining hosts for this loop 41445 1727204203.20860: done getting the remaining hosts for this loop 41445 1727204203.20864: getting the next task for host managed-node3 41445 1727204203.20871: done getting next task for host managed-node3 41445 1727204203.20878: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204203.20881: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204203.20908: getting variables 41445 1727204203.20910: in VariableManager get_vars() 41445 1727204203.20952: Calling all_inventory to load vars for managed-node3 41445 1727204203.20955: Calling groups_inventory to load vars for managed-node3 41445 1727204203.20957: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.20969: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.20973: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.20979: done sending task result for task 028d2410-947f-bf02-eee4-000000000078 41445 1727204203.20983: WORKER PROCESS EXITING 41445 1727204203.21081: Calling groups_plugins_play to load vars for managed-node3 41445 1727204203.24143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204203.27400: done with get_vars() 41445 1727204203.27546: done getting variables 41445 1727204203.27664: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.103) 0:00:22.065 ***** 41445 1727204203.27794: entering _queue_task() for managed-node3/debug 41445 1727204203.28550: worker is 1 (out of 1 available) 41445 1727204203.28564: exiting _queue_task() for managed-node3/debug 41445 1727204203.28580: done queuing things up, now waiting for results queue to drain 41445 1727204203.28582: waiting for pending results... 
41445 1727204203.29140: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204203.29450: in run() - task 028d2410-947f-bf02-eee4-000000000079 41445 1727204203.29473: variable 'ansible_search_path' from source: unknown 41445 1727204203.29498: variable 'ansible_search_path' from source: unknown 41445 1727204203.29547: calling self._execute() 41445 1727204203.29786: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.29973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.29977: variable 'omit' from source: magic vars 41445 1727204203.30651: variable 'ansible_distribution_major_version' from source: facts 41445 1727204203.30670: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204203.30685: variable 'omit' from source: magic vars 41445 1727204203.30807: variable 'omit' from source: magic vars 41445 1727204203.31056: variable 'omit' from source: magic vars 41445 1727204203.31060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204203.31062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204203.31177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204203.31199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204203.31219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204203.31252: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204203.31260: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.31272: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204203.31692: Set connection var ansible_shell_executable to /bin/sh 41445 1727204203.31698: Set connection var ansible_shell_type to sh 41445 1727204203.31700: Set connection var ansible_pipelining to False 41445 1727204203.31703: Set connection var ansible_timeout to 10 41445 1727204203.31704: Set connection var ansible_connection to ssh 41445 1727204203.31706: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204203.31708: variable 'ansible_shell_executable' from source: unknown 41445 1727204203.31710: variable 'ansible_connection' from source: unknown 41445 1727204203.31712: variable 'ansible_module_compression' from source: unknown 41445 1727204203.31714: variable 'ansible_shell_type' from source: unknown 41445 1727204203.31716: variable 'ansible_shell_executable' from source: unknown 41445 1727204203.31717: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.31719: variable 'ansible_pipelining' from source: unknown 41445 1727204203.31721: variable 'ansible_timeout' from source: unknown 41445 1727204203.31722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.31903: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204203.31992: variable 'omit' from source: magic vars 41445 1727204203.31995: starting attempt loop 41445 1727204203.31998: running the handler 41445 1727204203.32247: variable '__network_connections_result' from source: set_fact 41445 1727204203.32311: handler run complete 41445 1727204203.32330: attempt loop complete, returning result 41445 1727204203.32333: _execute() done 41445 1727204203.32336: dumping result to json 41445 1727204203.32338: 
done dumping result, returning 41445 1727204203.32463: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-bf02-eee4-000000000079] 41445 1727204203.32469: sending task result for task 028d2410-947f-bf02-eee4-000000000079 41445 1727204203.32561: done sending task result for task 028d2410-947f-bf02-eee4-000000000079 41445 1727204203.32564: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (is-modified)", "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied" ] } 41445 1727204203.32640: no more pending results, returning what we have 41445 1727204203.32645: results queue empty 41445 1727204203.32646: checking for any_errors_fatal 41445 1727204203.32654: done checking for any_errors_fatal 41445 1727204203.32655: checking for max_fail_percentage 41445 1727204203.32657: done checking for max_fail_percentage 41445 1727204203.32658: checking to see if all hosts have failed and the running result is not ok 41445 1727204203.32659: done checking to see if all hosts have failed 41445 1727204203.32660: getting the remaining hosts for this loop 41445 1727204203.32661: done getting the remaining hosts for this loop 41445 1727204203.32665: getting the next task for host managed-node3 41445 1727204203.32673: done getting next task for host managed-node3 41445 1727204203.32677: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204203.32680: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204203.32691: getting variables 41445 1727204203.32693: in VariableManager get_vars() 41445 1727204203.32732: Calling all_inventory to load vars for managed-node3 41445 1727204203.32735: Calling groups_inventory to load vars for managed-node3 41445 1727204203.32737: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.32746: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.32749: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.32751: Calling groups_plugins_play to load vars for managed-node3 41445 1727204203.35372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204203.37055: done with get_vars() 41445 1727204203.37083: done getting variables 41445 1727204203.37151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.093) 0:00:22.159 ***** 41445 1727204203.37193: entering _queue_task() for managed-node3/debug 41445 1727204203.37755: worker is 1 (out of 1 available) 41445 1727204203.37767: exiting _queue_task() for 
managed-node3/debug 41445 1727204203.37787: done queuing things up, now waiting for results queue to drain 41445 1727204203.37788: waiting for pending results... 41445 1727204203.37888: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204203.38108: in run() - task 028d2410-947f-bf02-eee4-00000000007a 41445 1727204203.38112: variable 'ansible_search_path' from source: unknown 41445 1727204203.38117: variable 'ansible_search_path' from source: unknown 41445 1727204203.38123: calling self._execute() 41445 1727204203.38229: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.38244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.38261: variable 'omit' from source: magic vars 41445 1727204203.38655: variable 'ansible_distribution_major_version' from source: facts 41445 1727204203.38679: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204203.38690: variable 'omit' from source: magic vars 41445 1727204203.38749: variable 'omit' from source: magic vars 41445 1727204203.38803: variable 'omit' from source: magic vars 41445 1727204203.38868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204203.38899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204203.38925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204203.38979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204203.38982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204203.39008: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 41445 1727204203.39017: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.39024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.39132: Set connection var ansible_shell_executable to /bin/sh 41445 1727204203.39190: Set connection var ansible_shell_type to sh 41445 1727204203.39193: Set connection var ansible_pipelining to False 41445 1727204203.39194: Set connection var ansible_timeout to 10 41445 1727204203.39196: Set connection var ansible_connection to ssh 41445 1727204203.39197: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204203.39199: variable 'ansible_shell_executable' from source: unknown 41445 1727204203.39200: variable 'ansible_connection' from source: unknown 41445 1727204203.39202: variable 'ansible_module_compression' from source: unknown 41445 1727204203.39204: variable 'ansible_shell_type' from source: unknown 41445 1727204203.39205: variable 'ansible_shell_executable' from source: unknown 41445 1727204203.39214: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.39220: variable 'ansible_pipelining' from source: unknown 41445 1727204203.39225: variable 'ansible_timeout' from source: unknown 41445 1727204203.39230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.39378: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204203.39397: variable 'omit' from source: magic vars 41445 1727204203.39411: starting attempt loop 41445 1727204203.39430: running the handler 41445 1727204203.39517: variable '__network_connections_result' from source: set_fact 41445 1727204203.39567: variable 
'__network_connections_result' from source: set_fact 41445 1727204203.39763: handler run complete 41445 1727204203.39800: attempt loop complete, returning result 41445 1727204203.39807: _execute() done 41445 1727204203.39814: dumping result to json 41445 1727204203.39822: done dumping result, returning 41445 1727204203.39833: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-bf02-eee4-00000000007a] 41445 1727204203.39846: sending task result for task 028d2410-947f-bf02-eee4-00000000007a ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": "custom" }, { "gateway": "198.51.100.8", "metric": 50, "network": "192.0.2.64", "prefix": 26, "src": "198.51.100.3", "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223 (is-modified)\n[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': update connection ethtest0, d37af1d3-4475-460d-968a-fd721e68b223", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 
d37af1d3-4475-460d-968a-fd721e68b223 (is-modified)", "[005] #0, state:up persistent_state:present, 'ethtest0': connection reapplied" ] } } 41445 1727204203.40380: no more pending results, returning what we have 41445 1727204203.40383: results queue empty 41445 1727204203.40384: checking for any_errors_fatal 41445 1727204203.40389: done checking for any_errors_fatal 41445 1727204203.40390: checking for max_fail_percentage 41445 1727204203.40391: done checking for max_fail_percentage 41445 1727204203.40392: checking to see if all hosts have failed and the running result is not ok 41445 1727204203.40393: done checking to see if all hosts have failed 41445 1727204203.40399: getting the remaining hosts for this loop 41445 1727204203.40400: done getting the remaining hosts for this loop 41445 1727204203.40403: getting the next task for host managed-node3 41445 1727204203.40409: done getting next task for host managed-node3 41445 1727204203.40413: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204203.40416: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204203.40427: getting variables 41445 1727204203.40428: in VariableManager get_vars() 41445 1727204203.40465: Calling all_inventory to load vars for managed-node3 41445 1727204203.40468: Calling groups_inventory to load vars for managed-node3 41445 1727204203.40470: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.40485: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.40488: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.40494: done sending task result for task 028d2410-947f-bf02-eee4-00000000007a 41445 1727204203.40497: WORKER PROCESS EXITING 41445 1727204203.40501: Calling groups_plugins_play to load vars for managed-node3 41445 1727204203.41988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204203.43582: done with get_vars() 41445 1727204203.43604: done getting variables 41445 1727204203.43664: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.065) 0:00:22.224 ***** 41445 1727204203.43698: entering _queue_task() for managed-node3/debug 41445 1727204203.44173: worker is 1 (out of 1 available) 41445 1727204203.44188: exiting _queue_task() for managed-node3/debug 41445 1727204203.44197: done queuing things up, now waiting for results queue to drain 41445 1727204203.44198: waiting for pending results... 
41445 1727204203.44414: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204203.44572: in run() - task 028d2410-947f-bf02-eee4-00000000007b 41445 1727204203.44597: variable 'ansible_search_path' from source: unknown 41445 1727204203.44620: variable 'ansible_search_path' from source: unknown 41445 1727204203.44663: calling self._execute() 41445 1727204203.44768: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.44780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.44794: variable 'omit' from source: magic vars 41445 1727204203.45201: variable 'ansible_distribution_major_version' from source: facts 41445 1727204203.45218: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204203.45351: variable 'network_state' from source: role '' defaults 41445 1727204203.45371: Evaluated conditional (network_state != {}): False 41445 1727204203.45382: when evaluation is False, skipping this task 41445 1727204203.45390: _execute() done 41445 1727204203.45409: dumping result to json 41445 1727204203.45421: done dumping result, returning 41445 1727204203.45482: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-bf02-eee4-00000000007b] 41445 1727204203.45486: sending task result for task 028d2410-947f-bf02-eee4-00000000007b 41445 1727204203.45556: done sending task result for task 028d2410-947f-bf02-eee4-00000000007b 41445 1727204203.45559: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 41445 1727204203.45636: no more pending results, returning what we have 41445 1727204203.45640: results queue empty 41445 1727204203.45641: checking for any_errors_fatal 41445 1727204203.45656: done checking for any_errors_fatal 41445 1727204203.45656: checking for 
max_fail_percentage 41445 1727204203.45659: done checking for max_fail_percentage 41445 1727204203.45660: checking to see if all hosts have failed and the running result is not ok 41445 1727204203.45661: done checking to see if all hosts have failed 41445 1727204203.45661: getting the remaining hosts for this loop 41445 1727204203.45662: done getting the remaining hosts for this loop 41445 1727204203.45667: getting the next task for host managed-node3 41445 1727204203.45674: done getting next task for host managed-node3 41445 1727204203.45680: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204203.45683: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204203.45705: getting variables 41445 1727204203.45707: in VariableManager get_vars() 41445 1727204203.45749: Calling all_inventory to load vars for managed-node3 41445 1727204203.45752: Calling groups_inventory to load vars for managed-node3 41445 1727204203.45755: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.45767: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.45771: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.45774: Calling groups_plugins_play to load vars for managed-node3 41445 1727204203.48736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204203.51969: done with get_vars() 41445 1727204203.52004: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.086) 0:00:22.310 ***** 41445 1727204203.52315: entering _queue_task() for managed-node3/ping 41445 1727204203.52870: worker is 1 (out of 1 available) 41445 1727204203.53087: exiting _queue_task() for managed-node3/ping 41445 1727204203.53099: done queuing things up, now waiting for results queue to drain 41445 1727204203.53100: waiting for pending results... 
41445 1727204203.53598: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204203.53913: in run() - task 028d2410-947f-bf02-eee4-00000000007c 41445 1727204203.53917: variable 'ansible_search_path' from source: unknown 41445 1727204203.53920: variable 'ansible_search_path' from source: unknown 41445 1727204203.53923: calling self._execute() 41445 1727204203.54071: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.54137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.54153: variable 'omit' from source: magic vars 41445 1727204203.54994: variable 'ansible_distribution_major_version' from source: facts 41445 1727204203.55019: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204203.55219: variable 'omit' from source: magic vars 41445 1727204203.55223: variable 'omit' from source: magic vars 41445 1727204203.55381: variable 'omit' from source: magic vars 41445 1727204203.55384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204203.55418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204203.55653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204203.55657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204203.55659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204203.55661: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204203.55665: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.55667: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41445 1727204203.55865: Set connection var ansible_shell_executable to /bin/sh 41445 1727204203.56081: Set connection var ansible_shell_type to sh 41445 1727204203.56089: Set connection var ansible_pipelining to False 41445 1727204203.56092: Set connection var ansible_timeout to 10 41445 1727204203.56094: Set connection var ansible_connection to ssh 41445 1727204203.56096: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204203.56098: variable 'ansible_shell_executable' from source: unknown 41445 1727204203.56100: variable 'ansible_connection' from source: unknown 41445 1727204203.56102: variable 'ansible_module_compression' from source: unknown 41445 1727204203.56105: variable 'ansible_shell_type' from source: unknown 41445 1727204203.56108: variable 'ansible_shell_executable' from source: unknown 41445 1727204203.56112: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204203.56115: variable 'ansible_pipelining' from source: unknown 41445 1727204203.56117: variable 'ansible_timeout' from source: unknown 41445 1727204203.56120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204203.56681: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204203.56686: variable 'omit' from source: magic vars 41445 1727204203.56689: starting attempt loop 41445 1727204203.56691: running the handler 41445 1727204203.56693: _low_level_execute_command(): starting 41445 1727204203.56695: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204203.58208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204203.58273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204203.58419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204203.58448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.60174: stdout chunk (state=3): >>>/root <<< 41445 1727204203.60234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.60832: stderr chunk (state=3): >>><<< 41445 1727204203.60836: stdout chunk (state=3): >>><<< 41445 1727204203.60839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204203.60841: _low_level_execute_command(): starting 41445 1727204203.60845: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134 `" && echo ansible-tmp-1727204203.6072607-42956-163579955409134="` echo /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134 `" ) && sleep 0' 41445 1727204203.61863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204203.61868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204203.62022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204203.62026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 
1727204203.62037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204203.62226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204203.62295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.64134: stdout chunk (state=3): >>>ansible-tmp-1727204203.6072607-42956-163579955409134=/root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134 <<< 41445 1727204203.64243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.64279: stderr chunk (state=3): >>><<< 41445 1727204203.64287: stdout chunk (state=3): >>><<< 41445 1727204203.64313: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204203.6072607-42956-163579955409134=/root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204203.64431: variable 'ansible_module_compression' from source: unknown 41445 1727204203.64646: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41445 1727204203.64649: variable 'ansible_facts' from source: unknown 41445 1727204203.64722: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py 41445 1727204203.65201: Sending initial data 41445 1727204203.65204: Sent initial data (153 bytes) 41445 1727204203.66325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204203.66339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204203.66350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204203.66747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204203.66784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.68280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204203.68328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204203.68383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmplj0br8xb /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py <<< 41445 1727204203.68386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py" <<< 41445 1727204203.68433: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmplj0br8xb" to remote "/root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py" <<< 41445 1727204203.69735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.69780: stderr chunk (state=3): >>><<< 41445 1727204203.69783: stdout chunk (state=3): >>><<< 41445 1727204203.69805: done transferring module to remote 41445 1727204203.69816: _low_level_execute_command(): starting 41445 1727204203.69821: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/ /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py && sleep 0' 41445 1727204203.71238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204203.71242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204203.71490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.73302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.73306: stderr chunk (state=3): >>><<< 41445 1727204203.73316: stdout chunk (state=3): >>><<< 41445 1727204203.73327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204203.73330: _low_level_execute_command(): starting 41445 1727204203.73336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/AnsiballZ_ping.py && sleep 0' 41445 1727204203.74384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204203.74493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204203.74503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204203.74518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204203.74531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204203.74539: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204203.74548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204203.74562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204203.74569: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204203.74578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204203.74586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204203.74596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204203.74788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204203.74797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204203.74866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.89553: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41445 1727204203.90654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.90694: stderr chunk (state=3): >>>Shared connection to 10.31.47.22 closed. <<< 41445 1727204203.90740: stderr chunk (state=3): >>><<< 41445 1727204203.90769: stdout chunk (state=3): >>><<< 41445 1727204203.90899: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204203.90923: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204203.90933: _low_level_execute_command(): starting 41445 1727204203.90938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204203.6072607-42956-163579955409134/ > /dev/null 2>&1 && sleep 0' 41445 1727204203.92393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204203.92691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204203.92781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204203.94565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204203.94639: stderr chunk (state=3): >>><<< 41445 1727204203.94642: stdout chunk (state=3): >>><<< 41445 1727204203.94708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204203.94719: handler run complete 41445 1727204203.94722: attempt loop complete, returning result 41445 
1727204203.94724: _execute() done 41445 1727204203.94726: dumping result to json 41445 1727204203.94728: done dumping result, returning 41445 1727204203.94730: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-bf02-eee4-00000000007c] 41445 1727204203.94732: sending task result for task 028d2410-947f-bf02-eee4-00000000007c 41445 1727204203.94815: done sending task result for task 028d2410-947f-bf02-eee4-00000000007c 41445 1727204203.94819: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 41445 1727204203.94948: no more pending results, returning what we have 41445 1727204203.94952: results queue empty 41445 1727204203.94953: checking for any_errors_fatal 41445 1727204203.94962: done checking for any_errors_fatal 41445 1727204203.94962: checking for max_fail_percentage 41445 1727204203.94964: done checking for max_fail_percentage 41445 1727204203.94965: checking to see if all hosts have failed and the running result is not ok 41445 1727204203.94966: done checking to see if all hosts have failed 41445 1727204203.94966: getting the remaining hosts for this loop 41445 1727204203.94968: done getting the remaining hosts for this loop 41445 1727204203.94971: getting the next task for host managed-node3 41445 1727204203.94984: done getting next task for host managed-node3 41445 1727204203.94987: ^ task is: TASK: meta (role_complete) 41445 1727204203.94990: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41445 1727204203.95002: getting variables 41445 1727204203.95004: in VariableManager get_vars() 41445 1727204203.95051: Calling all_inventory to load vars for managed-node3 41445 1727204203.95054: Calling groups_inventory to load vars for managed-node3 41445 1727204203.95056: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.95067: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.95070: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.95073: Calling groups_plugins_play to load vars for managed-node3 41445 1727204203.97901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204203.99630: done with get_vars() 41445 1727204203.99656: done getting variables 41445 1727204203.99760: done queuing things up, now waiting for results queue to drain 41445 1727204203.99762: results queue empty 41445 1727204203.99763: checking for any_errors_fatal 41445 1727204203.99766: done checking for any_errors_fatal 41445 1727204203.99766: checking for max_fail_percentage 41445 1727204203.99767: done checking for max_fail_percentage 41445 1727204203.99768: checking to see if all hosts have failed and the running result is not ok 41445 1727204203.99769: done checking to see if all hosts have failed 41445 1727204203.99770: getting the remaining hosts for this loop 41445 1727204203.99771: done getting the remaining hosts for this loop 41445 1727204203.99773: getting the next task for host managed-node3 41445 1727204203.99779: done getting next task for host managed-node3 41445 1727204203.99782: ^ task is: TASK: Get the routes from the named route table 'custom' 41445 1727204203.99783: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204203.99786: getting variables 41445 1727204203.99787: in VariableManager get_vars() 41445 1727204203.99802: Calling all_inventory to load vars for managed-node3 41445 1727204203.99804: Calling groups_inventory to load vars for managed-node3 41445 1727204203.99806: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204203.99814: Calling all_plugins_play to load vars for managed-node3 41445 1727204203.99817: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204203.99820: Calling groups_plugins_play to load vars for managed-node3 41445 1727204204.01014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204204.02747: done with get_vars() 41445 1727204204.02772: done getting variables 41445 1727204204.02818: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routes from the named route table 'custom'] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:121 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.505) 0:00:22.816 ***** 41445 1727204204.02844: entering _queue_task() for managed-node3/command 41445 1727204204.03385: worker is 1 (out of 1 available) 41445 1727204204.03395: exiting _queue_task() for managed-node3/command 41445 1727204204.03403: done queuing things up, now waiting for results queue to drain 41445 1727204204.03404: waiting for pending results... 
41445 1727204204.03641: running TaskExecutor() for managed-node3/TASK: Get the routes from the named route table 'custom' 41445 1727204204.03646: in run() - task 028d2410-947f-bf02-eee4-0000000000ac 41445 1727204204.03650: variable 'ansible_search_path' from source: unknown 41445 1727204204.03674: calling self._execute() 41445 1727204204.03788: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.03847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.03851: variable 'omit' from source: magic vars 41445 1727204204.04239: variable 'ansible_distribution_major_version' from source: facts 41445 1727204204.04257: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204204.04269: variable 'omit' from source: magic vars 41445 1727204204.04298: variable 'omit' from source: magic vars 41445 1727204204.04346: variable 'omit' from source: magic vars 41445 1727204204.04400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204204.04453: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204204.04499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204204.04503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204204.04517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204204.04551: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204204.04565: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.04580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.04719: Set connection var ansible_shell_executable to 
/bin/sh 41445 1727204204.04722: Set connection var ansible_shell_type to sh 41445 1727204204.04725: Set connection var ansible_pipelining to False 41445 1727204204.04728: Set connection var ansible_timeout to 10 41445 1727204204.04730: Set connection var ansible_connection to ssh 41445 1727204204.04744: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204204.04828: variable 'ansible_shell_executable' from source: unknown 41445 1727204204.04831: variable 'ansible_connection' from source: unknown 41445 1727204204.04834: variable 'ansible_module_compression' from source: unknown 41445 1727204204.04837: variable 'ansible_shell_type' from source: unknown 41445 1727204204.04839: variable 'ansible_shell_executable' from source: unknown 41445 1727204204.04841: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.04843: variable 'ansible_pipelining' from source: unknown 41445 1727204204.04845: variable 'ansible_timeout' from source: unknown 41445 1727204204.04847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.05056: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204204.05060: variable 'omit' from source: magic vars 41445 1727204204.05062: starting attempt loop 41445 1727204204.05070: running the handler 41445 1727204204.05072: _low_level_execute_command(): starting 41445 1727204204.05074: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204204.05881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204204.05903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.05966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204204.05988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.06017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.06106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.07692: stdout chunk (state=3): >>>/root <<< 41445 1727204204.07850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.08082: stdout chunk (state=3): >>><<< 41445 1727204204.08085: stderr chunk (state=3): >>><<< 41445 1727204204.08087: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.08090: _low_level_execute_command(): starting 41445 1727204204.08093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378 `" && echo ansible-tmp-1727204204.0799634-42977-105357933815378="` echo /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378 `" ) && sleep 0' 41445 1727204204.09381: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204204.09394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.09406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.09465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.09655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.09739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.11574: stdout chunk (state=3): >>>ansible-tmp-1727204204.0799634-42977-105357933815378=/root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378 <<< 41445 1727204204.11682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.11717: stderr chunk (state=3): >>><<< 41445 1727204204.11758: stdout chunk (state=3): >>><<< 41445 1727204204.11784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204204.0799634-42977-105357933815378=/root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.11970: variable 'ansible_module_compression' from source: unknown 41445 1727204204.11973: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204204.12101: variable 'ansible_facts' from source: unknown 41445 1727204204.12189: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py 41445 1727204204.12718: Sending initial data 41445 1727204204.12723: Sent initial data (156 bytes) 41445 1727204204.14301: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting 
O_NONBLOCK <<< 41445 1727204204.14738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.14778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.16292: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204204.16323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204204.16400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py" <<< 41445 1727204204.16781: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpx5obdxny /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py <<< 41445 1727204204.16785: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpx5obdxny" to remote "/root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py" <<< 41445 1727204204.17451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.17505: stderr chunk (state=3): >>><<< 41445 1727204204.17517: stdout chunk (state=3): >>><<< 41445 1727204204.17614: done transferring module to remote 41445 1727204204.17673: _low_level_execute_command(): starting 41445 1727204204.17844: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/ /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py && sleep 0' 41445 1727204204.18906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204204.18913: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204204.18916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.18959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204204.19094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.19127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.19189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.21000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.21048: stderr chunk (state=3): >>><<< 41445 1727204204.21057: stdout chunk (state=3): >>><<< 41445 1727204204.21079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.21182: _low_level_execute_command(): starting 41445 1727204204.21186: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/AnsiballZ_command.py && sleep 0' 41445 1727204204.22261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204204.22275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204204.22354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204204.22435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.22515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204204.22568: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.37885: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-24 14:56:44.374192", "end": "2024-09-24 14:56:44.377668", "delta": "0:00:00.003476", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204204.39716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204204.39720: stdout chunk (state=3): >>><<< 41445 1727204204.39722: stderr chunk (state=3): >>><<< 41445 1727204204.39725: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 \n198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 ", "stderr": "", "rc": 0, "cmd": ["ip", "route", "show", "table", "custom"], "start": "2024-09-24 14:56:44.374192", "end": "2024-09-24 14:56:44.377668", "delta": "0:00:00.003476", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route show table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204204.39728: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route show table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204204.39730: _low_level_execute_command(): starting 41445 1727204204.39732: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204204.0799634-42977-105357933815378/ > /dev/null 2>&1 && sleep 0' 41445 1727204204.40654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204204.40688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.40783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.40813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.40898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.42730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.42759: stdout chunk (state=3): >>><<< 41445 1727204204.42763: stderr chunk (state=3): >>><<< 41445 1727204204.42882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 41445 1727204204.42889: handler run complete 41445 1727204204.42892: Evaluated conditional (False): False 41445 1727204204.42894: attempt loop complete, returning result 41445 1727204204.42896: _execute() done 41445 1727204204.42898: dumping result to json 41445 1727204204.42900: done dumping result, returning 41445 1727204204.42902: done running TaskExecutor() for managed-node3/TASK: Get the routes from the named route table 'custom' [028d2410-947f-bf02-eee4-0000000000ac] 41445 1727204204.42904: sending task result for task 028d2410-947f-bf02-eee4-0000000000ac ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "route", "show", "table", "custom" ], "delta": "0:00:00.003476", "end": "2024-09-24 14:56:44.377668", "rc": 0, "start": "2024-09-24 14:56:44.374192" } STDOUT: 192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50 198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2 41445 1727204204.43114: no more pending results, returning what we have 41445 1727204204.43119: results queue empty 41445 1727204204.43120: checking for any_errors_fatal 41445 1727204204.43122: done checking for any_errors_fatal 41445 1727204204.43123: checking for max_fail_percentage 41445 1727204204.43125: done checking for max_fail_percentage 41445 1727204204.43127: checking to see if all hosts have failed and the running result is not ok 41445 1727204204.43128: done checking to see if all hosts have failed 41445 1727204204.43128: getting the remaining hosts for this loop 41445 1727204204.43130: done getting the remaining hosts for this loop 41445 1727204204.43133: getting the next task for host managed-node3 41445 1727204204.43142: done getting next task for host managed-node3 41445 1727204204.43145: ^ task is: TASK: Assert that the named route table 'custom' contains the specified route 41445 1727204204.43148: ^ state is: 
HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204204.43152: getting variables 41445 1727204204.43154: in VariableManager get_vars() 41445 1727204204.43200: Calling all_inventory to load vars for managed-node3 41445 1727204204.43204: Calling groups_inventory to load vars for managed-node3 41445 1727204204.43206: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204204.43221: Calling all_plugins_play to load vars for managed-node3 41445 1727204204.43225: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204204.43228: Calling groups_plugins_play to load vars for managed-node3 41445 1727204204.44049: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ac 41445 1727204204.44053: WORKER PROCESS EXITING 41445 1727204204.45244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204204.47103: done with get_vars() 41445 1727204204.47134: done getting variables 41445 1727204204.47199: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the named route table 'custom' contains the specified route] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:127 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.443) 0:00:23.260 ***** 41445 1727204204.47236: entering _queue_task() for managed-node3/assert 41445 1727204204.47686: worker is 1 (out of 1 available) 41445 
1727204204.47700: exiting _queue_task() for managed-node3/assert 41445 1727204204.47712: done queuing things up, now waiting for results queue to drain 41445 1727204204.47714: waiting for pending results... 41445 1727204204.47903: running TaskExecutor() for managed-node3/TASK: Assert that the named route table 'custom' contains the specified route 41445 1727204204.47956: in run() - task 028d2410-947f-bf02-eee4-0000000000ad 41445 1727204204.47979: variable 'ansible_search_path' from source: unknown 41445 1727204204.48008: calling self._execute() 41445 1727204204.48087: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.48091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.48100: variable 'omit' from source: magic vars 41445 1727204204.48384: variable 'ansible_distribution_major_version' from source: facts 41445 1727204204.48394: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204204.48399: variable 'omit' from source: magic vars 41445 1727204204.48417: variable 'omit' from source: magic vars 41445 1727204204.48444: variable 'omit' from source: magic vars 41445 1727204204.48482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204204.48507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204204.48523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204204.48536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204204.48545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204204.48570: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204204.48573: variable 
'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.48577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.48667: Set connection var ansible_shell_executable to /bin/sh 41445 1727204204.48671: Set connection var ansible_shell_type to sh 41445 1727204204.48674: Set connection var ansible_pipelining to False 41445 1727204204.48681: Set connection var ansible_timeout to 10 41445 1727204204.48684: Set connection var ansible_connection to ssh 41445 1727204204.48690: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204204.48715: variable 'ansible_shell_executable' from source: unknown 41445 1727204204.48718: variable 'ansible_connection' from source: unknown 41445 1727204204.48720: variable 'ansible_module_compression' from source: unknown 41445 1727204204.48723: variable 'ansible_shell_type' from source: unknown 41445 1727204204.48725: variable 'ansible_shell_executable' from source: unknown 41445 1727204204.48728: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.48730: variable 'ansible_pipelining' from source: unknown 41445 1727204204.48732: variable 'ansible_timeout' from source: unknown 41445 1727204204.48734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.48837: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204204.48846: variable 'omit' from source: magic vars 41445 1727204204.48850: starting attempt loop 41445 1727204204.48853: running the handler 41445 1727204204.48964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204204.49129: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204204.49161: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204204.49215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204204.49242: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204204.49301: variable 'route_table_custom' from source: set_fact 41445 1727204204.49323: Evaluated conditional (route_table_custom.stdout is search("198.51.100.128/26 via 198.51.100.1 dev ethtest0 proto static metric 2")): True 41445 1727204204.49430: variable 'route_table_custom' from source: set_fact 41445 1727204204.49480: Evaluated conditional (route_table_custom.stdout is search("198.51.100.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4")): True 41445 1727204204.49683: variable 'route_table_custom' from source: set_fact 41445 1727204204.49687: Evaluated conditional (route_table_custom.stdout is search("192.0.2.64/26 via 198.51.100.8 dev ethtest0 proto static src 198.51.100.3 metric 50")): True 41445 1727204204.49689: handler run complete 41445 1727204204.49691: attempt loop complete, returning result 41445 1727204204.49692: _execute() done 41445 1727204204.49694: dumping result to json 41445 1727204204.49695: done dumping result, returning 41445 1727204204.49697: done running TaskExecutor() for managed-node3/TASK: Assert that the named route table 'custom' contains the specified route [028d2410-947f-bf02-eee4-0000000000ad] 41445 1727204204.49699: sending task result for task 028d2410-947f-bf02-eee4-0000000000ad 41445 1727204204.49754: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ad 41445 1727204204.49756: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41445 1727204204.49822: no more pending results, returning what we have 41445 
1727204204.49826: results queue empty 41445 1727204204.49827: checking for any_errors_fatal 41445 1727204204.49838: done checking for any_errors_fatal 41445 1727204204.49839: checking for max_fail_percentage 41445 1727204204.49841: done checking for max_fail_percentage 41445 1727204204.49842: checking to see if all hosts have failed and the running result is not ok 41445 1727204204.49842: done checking to see if all hosts have failed 41445 1727204204.49843: getting the remaining hosts for this loop 41445 1727204204.49845: done getting the remaining hosts for this loop 41445 1727204204.49848: getting the next task for host managed-node3 41445 1727204204.49856: done getting next task for host managed-node3 41445 1727204204.49859: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 41445 1727204204.49862: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204204.49866: getting variables 41445 1727204204.49868: in VariableManager get_vars() 41445 1727204204.49940: Calling all_inventory to load vars for managed-node3 41445 1727204204.49943: Calling groups_inventory to load vars for managed-node3 41445 1727204204.49946: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204204.49957: Calling all_plugins_play to load vars for managed-node3 41445 1727204204.49960: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204204.49962: Calling groups_plugins_play to load vars for managed-node3 41445 1727204204.51159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204204.52015: done with get_vars() 41445 1727204204.52031: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:135 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.048) 0:00:23.308 ***** 41445 1727204204.52103: entering _queue_task() for managed-node3/file 41445 1727204204.52338: worker is 1 (out of 1 available) 41445 1727204204.52351: exiting _queue_task() for managed-node3/file 41445 1727204204.52362: done queuing things up, now waiting for results queue to drain 41445 1727204204.52363: waiting for pending results... 
41445 1727204204.52545: running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 41445 1727204204.52604: in run() - task 028d2410-947f-bf02-eee4-0000000000ae 41445 1727204204.52618: variable 'ansible_search_path' from source: unknown 41445 1727204204.52646: calling self._execute() 41445 1727204204.52731: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.52735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.52745: variable 'omit' from source: magic vars 41445 1727204204.53198: variable 'ansible_distribution_major_version' from source: facts 41445 1727204204.53201: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204204.53204: variable 'omit' from source: magic vars 41445 1727204204.53206: variable 'omit' from source: magic vars 41445 1727204204.53209: variable 'omit' from source: magic vars 41445 1727204204.53251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204204.53290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204204.53319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204204.53340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204204.53354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204204.53391: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204204.53400: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.53419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.53535: Set connection var 
ansible_shell_executable to /bin/sh 41445 1727204204.53543: Set connection var ansible_shell_type to sh 41445 1727204204.53556: Set connection var ansible_pipelining to False 41445 1727204204.53568: Set connection var ansible_timeout to 10 41445 1727204204.53574: Set connection var ansible_connection to ssh 41445 1727204204.53588: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204204.53618: variable 'ansible_shell_executable' from source: unknown 41445 1727204204.53680: variable 'ansible_connection' from source: unknown 41445 1727204204.53683: variable 'ansible_module_compression' from source: unknown 41445 1727204204.53686: variable 'ansible_shell_type' from source: unknown 41445 1727204204.53688: variable 'ansible_shell_executable' from source: unknown 41445 1727204204.53690: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204204.53692: variable 'ansible_pipelining' from source: unknown 41445 1727204204.53694: variable 'ansible_timeout' from source: unknown 41445 1727204204.53696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204204.53899: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204204.53917: variable 'omit' from source: magic vars 41445 1727204204.53927: starting attempt loop 41445 1727204204.53933: running the handler 41445 1727204204.54067: _low_level_execute_command(): starting 41445 1727204204.54072: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204204.54635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.54653: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.54719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.54745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.54766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.56374: stdout chunk (state=3): >>>/root <<< 41445 1727204204.56484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.56540: stderr chunk (state=3): >>><<< 41445 1727204204.56544: stdout chunk (state=3): >>><<< 41445 1727204204.56555: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.56568: _low_level_execute_command(): starting 41445 1727204204.56577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794 `" && echo ansible-tmp-1727204204.565566-42999-150745858798794="` echo /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794 `" ) && sleep 0' 41445 1727204204.57294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.57347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204204.57362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.57389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.57455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.59313: stdout chunk (state=3): >>>ansible-tmp-1727204204.565566-42999-150745858798794=/root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794 <<< 41445 1727204204.59452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.59455: stdout chunk (state=3): >>><<< 41445 1727204204.59456: stderr chunk (state=3): >>><<< 41445 1727204204.59468: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204204.565566-42999-150745858798794=/root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.59516: variable 'ansible_module_compression' from source: unknown 41445 1727204204.59558: ANSIBALLZ: Using lock for file 41445 1727204204.59561: ANSIBALLZ: Acquiring lock 41445 1727204204.59563: ANSIBALLZ: Lock acquired: 140182283769120 41445 1727204204.59565: ANSIBALLZ: Creating module 41445 1727204204.72388: ANSIBALLZ: Writing module into payload 41445 1727204204.72493: ANSIBALLZ: Writing module 41445 1727204204.72512: ANSIBALLZ: Renaming module 41445 1727204204.72516: ANSIBALLZ: Done creating module 41445 1727204204.72529: variable 'ansible_facts' from source: unknown 41445 1727204204.72580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py 41445 1727204204.72683: Sending initial data 41445 1727204204.72687: Sent initial data (152 bytes) 41445 1727204204.73147: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204204.73151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.73153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.73156: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.73194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204204.73207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.73255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.74814: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41445 1727204204.74821: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204204.74847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204204.74882: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpdiiuiisg /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py <<< 41445 1727204204.74893: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py" <<< 41445 1727204204.74913: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpdiiuiisg" to remote "/root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py" <<< 41445 1727204204.74919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py" <<< 41445 1727204204.75438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.75481: stderr chunk (state=3): >>><<< 41445 1727204204.75484: stdout chunk (state=3): >>><<< 41445 1727204204.75524: done transferring module to remote 41445 1727204204.75532: _low_level_execute_command(): starting 41445 1727204204.75537: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/ /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py && sleep 0' 41445 1727204204.75950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.75983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204204.75986: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.75989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.75991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.76044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204204.76051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.76053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.76092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.77799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.77823: stderr chunk (state=3): >>><<< 41445 1727204204.77826: stdout chunk (state=3): >>><<< 41445 1727204204.77841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.77844: _low_level_execute_command(): starting 41445 1727204204.77849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/AnsiballZ_file.py && sleep 0' 41445 1727204204.78253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204204.78288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204204.78291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204204.78293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.78295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.78298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204204.78345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.78353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.78392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.94155: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 41445 1727204204.95884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.95888: stderr chunk (state=3): >>>Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204204.95890: stderr chunk (state=3): >>><<< 41445 1727204204.95892: stdout chunk (state=3): >>><<< 41445 1727204204.95895: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204204.95897: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204204.95900: _low_level_execute_command(): starting 41445 1727204204.95903: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204204.565566-42999-150745858798794/ > /dev/null 2>&1 && sleep 0' 41445 1727204204.96079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204204.96087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204204.96098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.96115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204204.96128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204204.96136: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204204.96150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 
1727204204.96164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204204.96167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204204.96220: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204204.96224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204204.96227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204204.96270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204204.96280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204204.96283: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204204.96456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204204.96695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204204.96757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204204.98522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204204.98555: stderr chunk (state=3): >>><<< 41445 1727204204.98581: stdout chunk (state=3): >>><<< 41445 1727204204.98584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204204.98589: handler run complete 41445 1727204204.98604: attempt loop complete, returning result 41445 1727204204.98607: _execute() done 41445 1727204204.98612: dumping result to json 41445 1727204204.98615: done dumping result, returning 41445 1727204204.98621: done running TaskExecutor() for managed-node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [028d2410-947f-bf02-eee4-0000000000ae] 41445 1727204204.98624: sending task result for task 028d2410-947f-bf02-eee4-0000000000ae 41445 1727204204.98721: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ae 41445 1727204204.98724: WORKER PROCESS EXITING changed: [managed-node3] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 41445 1727204204.98781: no more pending results, returning what we have 41445 1727204204.98784: results queue empty 41445 1727204204.98785: checking for any_errors_fatal 41445 1727204204.98790: done checking for any_errors_fatal 41445 1727204204.98791: checking for max_fail_percentage 41445 1727204204.98792: done checking for max_fail_percentage 41445 1727204204.98793: checking to see if all hosts have failed 
and the running result is not ok 41445 1727204204.98794: done checking to see if all hosts have failed 41445 1727204204.98794: getting the remaining hosts for this loop 41445 1727204204.98796: done getting the remaining hosts for this loop 41445 1727204204.98799: getting the next task for host managed-node3 41445 1727204204.98806: done getting next task for host managed-node3 41445 1727204204.98808: ^ task is: TASK: meta (flush_handlers) 41445 1727204204.98812: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204204.98815: getting variables 41445 1727204204.98817: in VariableManager get_vars() 41445 1727204204.98853: Calling all_inventory to load vars for managed-node3 41445 1727204204.98855: Calling groups_inventory to load vars for managed-node3 41445 1727204204.98857: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204204.98867: Calling all_plugins_play to load vars for managed-node3 41445 1727204204.98870: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204204.98872: Calling groups_plugins_play to load vars for managed-node3 41445 1727204205.01135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204205.02813: done with get_vars() 41445 1727204205.02841: done getting variables 41445 1727204205.02916: in VariableManager get_vars() 41445 1727204205.02932: Calling all_inventory to load vars for managed-node3 41445 1727204205.02935: Calling groups_inventory to load vars for managed-node3 41445 1727204205.02937: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204205.02944: Calling all_plugins_play to load vars for managed-node3 41445 1727204205.02947: Calling 
groups_plugins_inventory to load vars for managed-node3 41445 1727204205.02952: Calling groups_plugins_play to load vars for managed-node3 41445 1727204205.11432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204205.12954: done with get_vars() 41445 1727204205.12987: done queuing things up, now waiting for results queue to drain 41445 1727204205.12989: results queue empty 41445 1727204205.12990: checking for any_errors_fatal 41445 1727204205.12993: done checking for any_errors_fatal 41445 1727204205.12994: checking for max_fail_percentage 41445 1727204205.12995: done checking for max_fail_percentage 41445 1727204205.12996: checking to see if all hosts have failed and the running result is not ok 41445 1727204205.12996: done checking to see if all hosts have failed 41445 1727204205.12997: getting the remaining hosts for this loop 41445 1727204205.12998: done getting the remaining hosts for this loop 41445 1727204205.13001: getting the next task for host managed-node3 41445 1727204205.13004: done getting next task for host managed-node3 41445 1727204205.13005: ^ task is: TASK: meta (flush_handlers) 41445 1727204205.13006: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204205.13009: getting variables 41445 1727204205.13014: in VariableManager get_vars() 41445 1727204205.13032: Calling all_inventory to load vars for managed-node3 41445 1727204205.13035: Calling groups_inventory to load vars for managed-node3 41445 1727204205.13036: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204205.13063: Calling all_plugins_play to load vars for managed-node3 41445 1727204205.13066: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204205.13070: Calling groups_plugins_play to load vars for managed-node3 41445 1727204205.14417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204205.16088: done with get_vars() 41445 1727204205.16112: done getting variables 41445 1727204205.16173: in VariableManager get_vars() 41445 1727204205.16189: Calling all_inventory to load vars for managed-node3 41445 1727204205.16192: Calling groups_inventory to load vars for managed-node3 41445 1727204205.16194: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204205.16200: Calling all_plugins_play to load vars for managed-node3 41445 1727204205.16202: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204205.16205: Calling groups_plugins_play to load vars for managed-node3 41445 1727204205.17389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204205.19389: done with get_vars() 41445 1727204205.19422: done queuing things up, now waiting for results queue to drain 41445 1727204205.19425: results queue empty 41445 1727204205.19425: checking for any_errors_fatal 41445 1727204205.19427: done checking for any_errors_fatal 41445 1727204205.19427: checking for max_fail_percentage 41445 1727204205.19428: done checking for max_fail_percentage 41445 1727204205.19429: checking to see if all hosts have failed and the running result is not 
ok 41445 1727204205.19430: done checking to see if all hosts have failed 41445 1727204205.19431: getting the remaining hosts for this loop 41445 1727204205.19432: done getting the remaining hosts for this loop 41445 1727204205.19434: getting the next task for host managed-node3 41445 1727204205.19437: done getting next task for host managed-node3 41445 1727204205.19438: ^ task is: None 41445 1727204205.19440: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204205.19441: done queuing things up, now waiting for results queue to drain 41445 1727204205.19442: results queue empty 41445 1727204205.19442: checking for any_errors_fatal 41445 1727204205.19443: done checking for any_errors_fatal 41445 1727204205.19444: checking for max_fail_percentage 41445 1727204205.19444: done checking for max_fail_percentage 41445 1727204205.19445: checking to see if all hosts have failed and the running result is not ok 41445 1727204205.19446: done checking to see if all hosts have failed 41445 1727204205.19448: getting the next task for host managed-node3 41445 1727204205.19450: done getting next task for host managed-node3 41445 1727204205.19450: ^ task is: None 41445 1727204205.19452: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204205.19503: in VariableManager get_vars() 41445 1727204205.19525: done with get_vars() 41445 1727204205.19531: in VariableManager get_vars() 41445 1727204205.19543: done with get_vars() 41445 1727204205.19546: variable 'omit' from source: magic vars 41445 1727204205.19646: variable 'profile' from source: play vars 41445 1727204205.19750: in VariableManager get_vars() 41445 1727204205.19764: done with get_vars() 41445 1727204205.19787: variable 'omit' from source: magic vars 41445 1727204205.19852: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 41445 1727204205.20535: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41445 1727204205.20557: getting the remaining hosts for this loop 41445 1727204205.20559: done getting the remaining hosts for this loop 41445 1727204205.20561: getting the next task for host managed-node3 41445 1727204205.20564: done getting next task for host managed-node3 41445 1727204205.20566: ^ task is: TASK: Gathering Facts 41445 1727204205.20567: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204205.20569: getting variables 41445 1727204205.20570: in VariableManager get_vars() 41445 1727204205.20583: Calling all_inventory to load vars for managed-node3 41445 1727204205.20585: Calling groups_inventory to load vars for managed-node3 41445 1727204205.20587: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204205.20593: Calling all_plugins_play to load vars for managed-node3 41445 1727204205.20595: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204205.20597: Calling groups_plugins_play to load vars for managed-node3 41445 1727204205.21766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204205.23537: done with get_vars() 41445 1727204205.23555: done getting variables 41445 1727204205.23597: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.715) 0:00:24.023 ***** 41445 1727204205.23624: entering _queue_task() for managed-node3/gather_facts 41445 1727204205.23963: worker is 1 (out of 1 available) 41445 1727204205.23977: exiting _queue_task() for managed-node3/gather_facts 41445 1727204205.23990: done queuing things up, now waiting for results queue to drain 41445 1727204205.23992: waiting for pending results... 
41445 1727204205.24283: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41445 1727204205.24326: in run() - task 028d2410-947f-bf02-eee4-0000000006a2 41445 1727204205.24347: variable 'ansible_search_path' from source: unknown 41445 1727204205.24393: calling self._execute() 41445 1727204205.24497: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204205.24508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204205.24523: variable 'omit' from source: magic vars 41445 1727204205.24906: variable 'ansible_distribution_major_version' from source: facts 41445 1727204205.24932: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204205.24981: variable 'omit' from source: magic vars 41445 1727204205.24985: variable 'omit' from source: magic vars 41445 1727204205.25017: variable 'omit' from source: magic vars 41445 1727204205.25066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204205.25111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204205.25140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204205.25180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204205.25183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204205.25210: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204205.25217: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204205.25243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204205.25334: Set connection var ansible_shell_executable to /bin/sh 41445 1727204205.25342: Set 
connection var ansible_shell_type to sh 41445 1727204205.25381: Set connection var ansible_pipelining to False 41445 1727204205.25384: Set connection var ansible_timeout to 10 41445 1727204205.25387: Set connection var ansible_connection to ssh 41445 1727204205.25389: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204205.25412: variable 'ansible_shell_executable' from source: unknown 41445 1727204205.25420: variable 'ansible_connection' from source: unknown 41445 1727204205.25460: variable 'ansible_module_compression' from source: unknown 41445 1727204205.25463: variable 'ansible_shell_type' from source: unknown 41445 1727204205.25465: variable 'ansible_shell_executable' from source: unknown 41445 1727204205.25467: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204205.25469: variable 'ansible_pipelining' from source: unknown 41445 1727204205.25472: variable 'ansible_timeout' from source: unknown 41445 1727204205.25474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204205.25641: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204205.25656: variable 'omit' from source: magic vars 41445 1727204205.25679: starting attempt loop 41445 1727204205.25682: running the handler 41445 1727204205.25695: variable 'ansible_facts' from source: unknown 41445 1727204205.25783: _low_level_execute_command(): starting 41445 1727204205.25786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204205.26498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204205.26564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204205.26587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204205.26661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204205.28298: stdout chunk (state=3): >>>/root <<< 41445 1727204205.28434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204205.28448: stdout chunk (state=3): >>><<< 41445 1727204205.28469: stderr chunk (state=3): >>><<< 41445 1727204205.28492: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204205.28595: _low_level_execute_command(): starting 41445 1727204205.28600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262 `" && echo ansible-tmp-1727204205.2849977-43031-149751714253262="` echo /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262 `" ) && sleep 0' 41445 1727204205.29123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204205.29139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204205.29151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204205.29167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204205.29184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204205.29194: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204205.29289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204205.29315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204205.29328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204205.29395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204205.31232: stdout chunk (state=3): >>>ansible-tmp-1727204205.2849977-43031-149751714253262=/root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262 <<< 41445 1727204205.31386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204205.31396: stdout chunk (state=3): >>><<< 41445 1727204205.31407: stderr chunk (state=3): >>><<< 41445 1727204205.31429: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204205.2849977-43031-149751714253262=/root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204205.31586: variable 'ansible_module_compression' from source: unknown 41445 1727204205.31590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 1727204205.31593: variable 'ansible_facts' from source: unknown 41445 1727204205.31796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py 41445 1727204205.32047: Sending initial data 41445 1727204205.32051: Sent initial data (154 bytes) 41445 1727204205.32619: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204205.32631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204205.32646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204205.32663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204205.32772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204205.32804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204205.32871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204205.34358: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41445 1727204205.34390: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204205.34413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204205.34484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpk1aebl7s /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py <<< 41445 1727204205.34487: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py" <<< 41445 1727204205.34519: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpk1aebl7s" to remote "/root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py" <<< 41445 1727204205.35984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204205.36063: stderr chunk (state=3): >>><<< 41445 1727204205.36067: stdout chunk (state=3): >>><<< 41445 1727204205.36202: done transferring module to remote 41445 1727204205.36205: _low_level_execute_command(): starting 41445 1727204205.36208: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/ /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py && sleep 0' 41445 1727204205.36952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204205.37030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204205.37082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204205.37103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204205.37178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204205.38867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204205.38898: stderr chunk (state=3): >>><<< 41445 1727204205.38902: stdout chunk (state=3): >>><<< 41445 1727204205.38919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204205.38922: _low_level_execute_command(): starting 41445 1727204205.38925: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/AnsiballZ_setup.py && sleep 0' 41445 1727204205.39582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204205.39586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204205.39589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204205.39650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204206.06764: stdout chunk (state=3): >>> {"ansible_facts": 
{"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_local": {}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "45", "epoch": "1727204205", "epoch_int": "1727204205", "date": "2024-09-24", "time": "14:56:45", "iso8601_micro": "2024-09-24T18:56:45.670003Z", "iso8601": "2024-09-24T18:56:45Z", "iso8601_basic": "20240924T145645670003", "iso8601_basic_short": "20240924T145645", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, <<< 41445 1727204206.06781: stdout chunk (state=3): >>>"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["ethtest0", "rpltstbr", "peerethtest0", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": 
"off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "5a:c9:79:b9:fb:44", "mtu": 1500, "active": true, "type": 
"ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::58c9:79ff:feb9:fb44", "<<< 41445 1727204206.06807: stdout chunk (state=3): >>>prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": 
"off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "22:cb:5d:bd:5d:c6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::20cb:5dff:febd:5dc6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off 
[fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": <<< 41445 1727204206.06835: stdout chunk (state=3): >>>"on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "198.51.100.3", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d", "fe80::58c9:79ff:feb9:fb44", "fe80::20cb:5dff:febd:5dc6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d", "fe80::20cb:5dff:febd:5dc6", "fe80::58c9:79ff:feb9:fb44"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2927, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 604, "free": 2927}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": 
{"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "h<<< 41445 1727204206.06844: stdout chunk (state=3): >>>olders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 782, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788700672, "block_size": 4096, "block_total": 65519099, "block_available": 63913257, "block_used": 1605842, "inode_total": 131070960, "inode_available": 131027340, "inode_used": 43620, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.6083984375, "5m": 0.52978515625, "15m": 0.31201171875}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204206.08834: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204206.08859: stderr chunk (state=3): >>><<< 41445 1727204206.08863: stdout chunk (state=3): >>><<< 41445 1727204206.08905: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_local": {}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "45", "epoch": "1727204205", 
"epoch_int": "1727204205", "date": "2024-09-24", "time": "14:56:45", "iso8601_micro": "2024-09-24T18:56:45.670003Z", "iso8601": "2024-09-24T18:56:45Z", "iso8601_basic": "20240924T145645670003", "iso8601_basic_short": "20240924T145645", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["ethtest0", "rpltstbr", "peerethtest0", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "5a:c9:79:b9:fb:44", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "fe80::58c9:79ff:feb9:fb44", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", 
"rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "22:cb:5d:bd:5d:c6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::20cb:5dff:febd:5dc6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": 
"on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", 
"tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "198.51.100.3", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d", "fe80::58c9:79ff:feb9:fb44", "fe80::20cb:5dff:febd:5dc6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d", "fe80::20cb:5dff:febd:5dc6", "fe80::58c9:79ff:feb9:fb44"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2927, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 604, "free": 2927}, "nocache": {"free": 3267, "used": 264}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", 
"ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 782, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788700672, "block_size": 4096, "block_total": 65519099, "block_available": 63913257, "block_used": 1605842, "inode_total": 131070960, "inode_available": 131027340, "inode_used": 43620, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.6083984375, "5m": 0.52978515625, "15m": 0.31201171875}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204206.09226: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204206.09244: _low_level_execute_command(): starting 41445 1727204206.09249: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204205.2849977-43031-149751714253262/ > /dev/null 2>&1 && sleep 0' 41445 1727204206.09656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204206.09661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.09673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.09729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204206.09736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204206.09770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204206.11789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204206.11793: stdout chunk (state=3): >>><<< 41445 1727204206.11796: stderr chunk (state=3): >>><<< 41445 1727204206.11798: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204206.11800: handler run complete 41445 1727204206.11860: variable 
'ansible_facts' from source: unknown 41445 1727204206.11984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.12261: variable 'ansible_facts' from source: unknown 41445 1727204206.12330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.12424: attempt loop complete, returning result 41445 1727204206.12427: _execute() done 41445 1727204206.12430: dumping result to json 41445 1727204206.12461: done dumping result, returning 41445 1727204206.12465: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [028d2410-947f-bf02-eee4-0000000006a2] 41445 1727204206.12470: sending task result for task 028d2410-947f-bf02-eee4-0000000006a2 41445 1727204206.12813: done sending task result for task 028d2410-947f-bf02-eee4-0000000006a2 41445 1727204206.12816: WORKER PROCESS EXITING ok: [managed-node3] 41445 1727204206.13090: no more pending results, returning what we have 41445 1727204206.13093: results queue empty 41445 1727204206.13093: checking for any_errors_fatal 41445 1727204206.13094: done checking for any_errors_fatal 41445 1727204206.13094: checking for max_fail_percentage 41445 1727204206.13096: done checking for max_fail_percentage 41445 1727204206.13096: checking to see if all hosts have failed and the running result is not ok 41445 1727204206.13097: done checking to see if all hosts have failed 41445 1727204206.13097: getting the remaining hosts for this loop 41445 1727204206.13098: done getting the remaining hosts for this loop 41445 1727204206.13100: getting the next task for host managed-node3 41445 1727204206.13105: done getting next task for host managed-node3 41445 1727204206.13106: ^ task is: TASK: meta (flush_handlers) 41445 1727204206.13108: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204206.13111: getting variables 41445 1727204206.13112: in VariableManager get_vars() 41445 1727204206.13133: Calling all_inventory to load vars for managed-node3 41445 1727204206.13135: Calling groups_inventory to load vars for managed-node3 41445 1727204206.13136: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.13144: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.13146: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.13147: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.13932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.15479: done with get_vars() 41445 1727204206.15501: done getting variables 41445 1727204206.15572: in VariableManager get_vars() 41445 1727204206.15587: Calling all_inventory to load vars for managed-node3 41445 1727204206.15590: Calling groups_inventory to load vars for managed-node3 41445 1727204206.15592: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.15597: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.15600: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.15602: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.16359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.17254: done with get_vars() 41445 1727204206.17273: done queuing things up, now waiting for results queue to drain 41445 1727204206.17275: results queue empty 41445 1727204206.17277: checking for any_errors_fatal 41445 1727204206.17280: done checking for any_errors_fatal 41445 1727204206.17280: checking for max_fail_percentage 41445 
1727204206.17281: done checking for max_fail_percentage 41445 1727204206.17281: checking to see if all hosts have failed and the running result is not ok 41445 1727204206.17286: done checking to see if all hosts have failed 41445 1727204206.17286: getting the remaining hosts for this loop 41445 1727204206.17287: done getting the remaining hosts for this loop 41445 1727204206.17289: getting the next task for host managed-node3 41445 1727204206.17291: done getting next task for host managed-node3 41445 1727204206.17293: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204206.17294: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204206.17302: getting variables 41445 1727204206.17302: in VariableManager get_vars() 41445 1727204206.17312: Calling all_inventory to load vars for managed-node3 41445 1727204206.17314: Calling groups_inventory to load vars for managed-node3 41445 1727204206.17315: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.17318: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.17320: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.17321: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.18172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.19352: done with get_vars() 41445 1727204206.19366: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.957) 0:00:24.982 
***** 41445 1727204206.19424: entering _queue_task() for managed-node3/include_tasks 41445 1727204206.19678: worker is 1 (out of 1 available) 41445 1727204206.19694: exiting _queue_task() for managed-node3/include_tasks 41445 1727204206.19704: done queuing things up, now waiting for results queue to drain 41445 1727204206.19706: waiting for pending results... 41445 1727204206.19880: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204206.19957: in run() - task 028d2410-947f-bf02-eee4-0000000000b7 41445 1727204206.19971: variable 'ansible_search_path' from source: unknown 41445 1727204206.19974: variable 'ansible_search_path' from source: unknown 41445 1727204206.20003: calling self._execute() 41445 1727204206.20085: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.20091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204206.20099: variable 'omit' from source: magic vars 41445 1727204206.20377: variable 'ansible_distribution_major_version' from source: facts 41445 1727204206.20385: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204206.20391: _execute() done 41445 1727204206.20394: dumping result to json 41445 1727204206.20398: done dumping result, returning 41445 1727204206.20405: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-bf02-eee4-0000000000b7] 41445 1727204206.20411: sending task result for task 028d2410-947f-bf02-eee4-0000000000b7 41445 1727204206.20497: done sending task result for task 028d2410-947f-bf02-eee4-0000000000b7 41445 1727204206.20499: WORKER PROCESS EXITING 41445 1727204206.20534: no more pending results, returning what we have 41445 1727204206.20538: in VariableManager get_vars() 41445 1727204206.20582: Calling all_inventory to load vars for managed-node3 41445 
1727204206.20586: Calling groups_inventory to load vars for managed-node3 41445 1727204206.20588: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.20599: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.20602: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.20605: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.21407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.22296: done with get_vars() 41445 1727204206.22313: variable 'ansible_search_path' from source: unknown 41445 1727204206.22314: variable 'ansible_search_path' from source: unknown 41445 1727204206.22333: we have included files to process 41445 1727204206.22334: generating all_blocks data 41445 1727204206.22335: done generating all_blocks data 41445 1727204206.22336: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204206.22336: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204206.22337: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204206.22706: done processing included file 41445 1727204206.22707: iterating over new_blocks loaded from include file 41445 1727204206.22708: in VariableManager get_vars() 41445 1727204206.22721: done with get_vars() 41445 1727204206.22723: filtering new block on tags 41445 1727204206.22733: done filtering new block on tags 41445 1727204206.22734: in VariableManager get_vars() 41445 1727204206.22747: done with get_vars() 41445 1727204206.22749: filtering new block on tags 41445 1727204206.22760: done filtering new block on tags 41445 1727204206.22761: in VariableManager get_vars() 41445 1727204206.22772: done with get_vars() 41445 
1727204206.22773: filtering new block on tags 41445 1727204206.22784: done filtering new block on tags 41445 1727204206.22786: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41445 1727204206.22789: extending task lists for all hosts with included blocks 41445 1727204206.23045: done extending task lists 41445 1727204206.23046: done processing included files 41445 1727204206.23046: results queue empty 41445 1727204206.23047: checking for any_errors_fatal 41445 1727204206.23047: done checking for any_errors_fatal 41445 1727204206.23048: checking for max_fail_percentage 41445 1727204206.23049: done checking for max_fail_percentage 41445 1727204206.23049: checking to see if all hosts have failed and the running result is not ok 41445 1727204206.23050: done checking to see if all hosts have failed 41445 1727204206.23050: getting the remaining hosts for this loop 41445 1727204206.23051: done getting the remaining hosts for this loop 41445 1727204206.23053: getting the next task for host managed-node3 41445 1727204206.23055: done getting next task for host managed-node3 41445 1727204206.23057: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204206.23058: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204206.23064: getting variables 41445 1727204206.23065: in VariableManager get_vars() 41445 1727204206.23073: Calling all_inventory to load vars for managed-node3 41445 1727204206.23075: Calling groups_inventory to load vars for managed-node3 41445 1727204206.23078: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.23081: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.23083: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.23084: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.23800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.24675: done with get_vars() 41445 1727204206.24693: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.053) 0:00:25.035 ***** 41445 1727204206.24768: entering _queue_task() for managed-node3/setup 41445 1727204206.25101: worker is 1 (out of 1 available) 41445 1727204206.25114: exiting _queue_task() for managed-node3/setup 41445 1727204206.25126: done queuing things up, now waiting for results queue to drain 41445 1727204206.25128: waiting for pending results... 
41445 1727204206.25504: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204206.25550: in run() - task 028d2410-947f-bf02-eee4-0000000006e3 41445 1727204206.25570: variable 'ansible_search_path' from source: unknown 41445 1727204206.25580: variable 'ansible_search_path' from source: unknown 41445 1727204206.25624: calling self._execute() 41445 1727204206.25721: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.25781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204206.25785: variable 'omit' from source: magic vars 41445 1727204206.26125: variable 'ansible_distribution_major_version' from source: facts 41445 1727204206.26171: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204206.26300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204206.27754: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204206.27802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204206.27830: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204206.27856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204206.27879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204206.27940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204206.27960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204206.28181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204206.28185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204206.28187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204206.28190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204206.28193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204206.28195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204206.28197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204206.28214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204206.28372: variable '__network_required_facts' from source: role 
'' defaults 41445 1727204206.28389: variable 'ansible_facts' from source: unknown 41445 1727204206.29113: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41445 1727204206.29121: when evaluation is False, skipping this task 41445 1727204206.29127: _execute() done 41445 1727204206.29133: dumping result to json 41445 1727204206.29139: done dumping result, returning 41445 1727204206.29149: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-bf02-eee4-0000000006e3] 41445 1727204206.29159: sending task result for task 028d2410-947f-bf02-eee4-0000000006e3 41445 1727204206.29256: done sending task result for task 028d2410-947f-bf02-eee4-0000000006e3 41445 1727204206.29263: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204206.29312: no more pending results, returning what we have 41445 1727204206.29316: results queue empty 41445 1727204206.29316: checking for any_errors_fatal 41445 1727204206.29318: done checking for any_errors_fatal 41445 1727204206.29319: checking for max_fail_percentage 41445 1727204206.29321: done checking for max_fail_percentage 41445 1727204206.29322: checking to see if all hosts have failed and the running result is not ok 41445 1727204206.29322: done checking to see if all hosts have failed 41445 1727204206.29323: getting the remaining hosts for this loop 41445 1727204206.29324: done getting the remaining hosts for this loop 41445 1727204206.29328: getting the next task for host managed-node3 41445 1727204206.29336: done getting next task for host managed-node3 41445 1727204206.29452: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204206.29455: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204206.29469: getting variables 41445 1727204206.29471: in VariableManager get_vars() 41445 1727204206.29508: Calling all_inventory to load vars for managed-node3 41445 1727204206.29513: Calling groups_inventory to load vars for managed-node3 41445 1727204206.29515: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.29524: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.29526: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.29529: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.30959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.32730: done with get_vars() 41445 1727204206.32751: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.080) 0:00:25.116 ***** 41445 1727204206.32847: entering _queue_task() for managed-node3/stat 41445 1727204206.33173: worker is 1 (out of 1 available) 41445 1727204206.33189: exiting _queue_task() for managed-node3/stat 41445 1727204206.33202: done queuing things up, now waiting for results queue to drain 41445 1727204206.33204: waiting for pending results... 
41445 1727204206.33600: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204206.33640: in run() - task 028d2410-947f-bf02-eee4-0000000006e5 41445 1727204206.33663: variable 'ansible_search_path' from source: unknown 41445 1727204206.33671: variable 'ansible_search_path' from source: unknown 41445 1727204206.33718: calling self._execute() 41445 1727204206.33830: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.33842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204206.33858: variable 'omit' from source: magic vars 41445 1727204206.34266: variable 'ansible_distribution_major_version' from source: facts 41445 1727204206.34288: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204206.34466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204206.34745: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204206.34805: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204206.34845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204206.34888: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204206.34982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204206.35016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204206.35047: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204206.35081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204206.35178: variable '__network_is_ostree' from source: set_fact 41445 1727204206.35219: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204206.35223: when evaluation is False, skipping this task 41445 1727204206.35225: _execute() done 41445 1727204206.35227: dumping result to json 41445 1727204206.35228: done dumping result, returning 41445 1727204206.35231: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-bf02-eee4-0000000006e5] 41445 1727204206.35234: sending task result for task 028d2410-947f-bf02-eee4-0000000006e5 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204206.35395: no more pending results, returning what we have 41445 1727204206.35400: results queue empty 41445 1727204206.35401: checking for any_errors_fatal 41445 1727204206.35410: done checking for any_errors_fatal 41445 1727204206.35411: checking for max_fail_percentage 41445 1727204206.35412: done checking for max_fail_percentage 41445 1727204206.35413: checking to see if all hosts have failed and the running result is not ok 41445 1727204206.35414: done checking to see if all hosts have failed 41445 1727204206.35415: getting the remaining hosts for this loop 41445 1727204206.35417: done getting the remaining hosts for this loop 41445 1727204206.35421: getting the next task for host managed-node3 41445 1727204206.35429: done getting next task for host managed-node3 41445 
1727204206.35433: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204206.35436: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204206.35450: getting variables 41445 1727204206.35451: in VariableManager get_vars() 41445 1727204206.35494: Calling all_inventory to load vars for managed-node3 41445 1727204206.35497: Calling groups_inventory to load vars for managed-node3 41445 1727204206.35499: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.35511: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.35516: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.35519: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.36388: done sending task result for task 028d2410-947f-bf02-eee4-0000000006e5 41445 1727204206.36391: WORKER PROCESS EXITING 41445 1727204206.37133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.39314: done with get_vars() 41445 1727204206.39337: done getting variables 41445 1727204206.39399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.065) 0:00:25.182 ***** 41445 1727204206.39433: entering _queue_task() for managed-node3/set_fact 41445 1727204206.39747: worker is 1 (out of 1 available) 41445 1727204206.39761: exiting _queue_task() for managed-node3/set_fact 41445 1727204206.39774: done queuing things up, now waiting for results queue to drain 41445 1727204206.39778: waiting for pending results... 41445 1727204206.40043: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204206.40173: in run() - task 028d2410-947f-bf02-eee4-0000000006e6 41445 1727204206.40203: variable 'ansible_search_path' from source: unknown 41445 1727204206.40211: variable 'ansible_search_path' from source: unknown 41445 1727204206.40251: calling self._execute() 41445 1727204206.40356: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.40367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204206.40385: variable 'omit' from source: magic vars 41445 1727204206.40788: variable 'ansible_distribution_major_version' from source: facts 41445 1727204206.40805: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204206.40982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204206.41226: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204206.41274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204206.41319: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 
1727204206.41417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204206.41454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204206.41486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204206.41513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204206.41542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204206.41622: variable '__network_is_ostree' from source: set_fact 41445 1727204206.41637: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204206.41643: when evaluation is False, skipping this task 41445 1727204206.41648: _execute() done 41445 1727204206.41653: dumping result to json 41445 1727204206.41659: done dumping result, returning 41445 1727204206.41667: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-bf02-eee4-0000000006e6] 41445 1727204206.41677: sending task result for task 028d2410-947f-bf02-eee4-0000000006e6 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204206.41811: no more pending results, returning what we have 41445 1727204206.41815: results queue empty 41445 1727204206.41816: checking for any_errors_fatal 41445 1727204206.41822: done checking 
for any_errors_fatal 41445 1727204206.41823: checking for max_fail_percentage 41445 1727204206.41825: done checking for max_fail_percentage 41445 1727204206.41826: checking to see if all hosts have failed and the running result is not ok 41445 1727204206.41827: done checking to see if all hosts have failed 41445 1727204206.41827: getting the remaining hosts for this loop 41445 1727204206.41829: done getting the remaining hosts for this loop 41445 1727204206.41833: getting the next task for host managed-node3 41445 1727204206.41842: done getting next task for host managed-node3 41445 1727204206.41846: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204206.41849: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204206.41863: getting variables 41445 1727204206.41865: in VariableManager get_vars() 41445 1727204206.41903: Calling all_inventory to load vars for managed-node3 41445 1727204206.41906: Calling groups_inventory to load vars for managed-node3 41445 1727204206.41908: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204206.41918: Calling all_plugins_play to load vars for managed-node3 41445 1727204206.41922: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204206.41924: Calling groups_plugins_play to load vars for managed-node3 41445 1727204206.42768: done sending task result for task 028d2410-947f-bf02-eee4-0000000006e6 41445 1727204206.42771: WORKER PROCESS EXITING 41445 1727204206.43681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204206.45274: done with get_vars() 41445 1727204206.45297: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.059) 0:00:25.241 ***** 41445 1727204206.45387: entering _queue_task() for managed-node3/service_facts 41445 1727204206.45680: worker is 1 (out of 1 available) 41445 1727204206.45693: exiting _queue_task() for managed-node3/service_facts 41445 1727204206.45705: done queuing things up, now waiting for results queue to drain 41445 1727204206.45706: waiting for pending results... 
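The `service_facts` task queued here produces the large `{"ansible_facts": {"services": {...}}}` payload that appears later in the stdout chunks, keyed by systemd unit name. A hedged sketch of such a task and one way its result is typically consumed; only the task name, its position in `set_facts.yml`, and the `ansible_distribution_major_version != '6'` conditional are taken from the log, and the consumer task is purely illustrative:

```yaml
# Hypothetical sketch, not the actual role source.
- name: Check which services are running
  service_facts:
  when: ansible_distribution_major_version != '6'

# Illustrative consumer (assumption): service_facts registers results under
# ansible_facts.services, keyed by unit name, as seen in the module stdout.
- name: Fail if NetworkManager is not running
  fail:
    msg: NetworkManager.service is not running
  when: ansible_facts.services['NetworkManager.service'].state != 'running'
```

In the captured stdout, `NetworkManager.service` reports `"state": "running"`, so the illustrative guard above would not trigger on this host.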
41445 1727204206.46096: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204206.46105: in run() - task 028d2410-947f-bf02-eee4-0000000006e8 41445 1727204206.46126: variable 'ansible_search_path' from source: unknown 41445 1727204206.46134: variable 'ansible_search_path' from source: unknown 41445 1727204206.46170: calling self._execute() 41445 1727204206.46271: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.46284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204206.46304: variable 'omit' from source: magic vars 41445 1727204206.46662: variable 'ansible_distribution_major_version' from source: facts 41445 1727204206.46681: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204206.46693: variable 'omit' from source: magic vars 41445 1727204206.46752: variable 'omit' from source: magic vars 41445 1727204206.46790: variable 'omit' from source: magic vars 41445 1727204206.46831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204206.46874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204206.46900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204206.46921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204206.46936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204206.46972: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204206.46983: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.46990: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41445 1727204206.47093: Set connection var ansible_shell_executable to /bin/sh 41445 1727204206.47101: Set connection var ansible_shell_type to sh 41445 1727204206.47111: Set connection var ansible_pipelining to False 41445 1727204206.47123: Set connection var ansible_timeout to 10 41445 1727204206.47130: Set connection var ansible_connection to ssh 41445 1727204206.47141: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204206.47173: variable 'ansible_shell_executable' from source: unknown 41445 1727204206.47183: variable 'ansible_connection' from source: unknown 41445 1727204206.47190: variable 'ansible_module_compression' from source: unknown 41445 1727204206.47196: variable 'ansible_shell_type' from source: unknown 41445 1727204206.47202: variable 'ansible_shell_executable' from source: unknown 41445 1727204206.47208: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204206.47277: variable 'ansible_pipelining' from source: unknown 41445 1727204206.47280: variable 'ansible_timeout' from source: unknown 41445 1727204206.47283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204206.47420: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204206.47435: variable 'omit' from source: magic vars 41445 1727204206.47443: starting attempt loop 41445 1727204206.47450: running the handler 41445 1727204206.47466: _low_level_execute_command(): starting 41445 1727204206.47479: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204206.48182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204206.48197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41445 1727204206.48213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204206.48263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.48326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204206.48351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204206.48399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204206.48481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204206.50093: stdout chunk (state=3): >>>/root <<< 41445 1727204206.50244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204206.50247: stdout chunk (state=3): >>><<< 41445 1727204206.50250: stderr chunk (state=3): >>><<< 41445 1727204206.50266: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204206.50287: _low_level_execute_command(): starting 41445 1727204206.50335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309 `" && echo ansible-tmp-1727204206.5027227-43081-85151365751309="` echo /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309 `" ) && sleep 0' 41445 1727204206.51423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204206.51437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204206.51650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.51655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204206.51690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204206.51703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204206.51756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204206.53656: stdout chunk (state=3): >>>ansible-tmp-1727204206.5027227-43081-85151365751309=/root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309 <<< 41445 1727204206.53757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204206.54190: stderr chunk (state=3): >>><<< 41445 1727204206.54194: stdout chunk (state=3): >>><<< 41445 1727204206.54201: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204206.5027227-43081-85151365751309=/root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204206.54204: variable 'ansible_module_compression' from source: unknown 41445 1727204206.54206: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41445 1727204206.54209: variable 'ansible_facts' from source: unknown 41445 1727204206.54424: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py 41445 1727204206.54842: Sending initial data 41445 1727204206.54929: Sent initial data (161 bytes) 41445 1727204206.55891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204206.55919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204206.55966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204206.57537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204206.57572: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204206.57609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py" <<< 41445 1727204206.57640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpm39jfo4c /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py <<< 41445 1727204206.57658: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpm39jfo4c" to remote "/root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py" <<< 41445 1727204206.58855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204206.58914: stderr chunk (state=3): >>><<< 41445 1727204206.59080: stdout chunk (state=3): >>><<< 41445 1727204206.59083: done transferring module to remote 41445 1727204206.59086: _low_level_execute_command(): starting 41445 1727204206.59088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/ /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py && sleep 0' 41445 1727204206.60306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204206.60320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204206.60391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.60625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204206.60671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204206.60701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204206.62465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204206.62475: stdout chunk (state=3): >>><<< 41445 1727204206.62491: stderr chunk (state=3): >>><<< 41445 1727204206.62510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204206.62518: _low_level_execute_command(): starting 41445 1727204206.62527: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/AnsiballZ_service_facts.py && sleep 0' 41445 1727204206.63921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204206.63935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.63946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204206.64007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204206.64094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204206.64189: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204206.64230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.12863: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 41445 1727204208.12897: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 41445 1727204208.12919: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 41445 1727204208.12948: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 41445 1727204208.12955: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41445 1727204208.14384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204208.14416: stderr chunk (state=3): >>><<< 41445 1727204208.14419: stdout chunk (state=3): >>><<< 41445 1727204208.14449: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", 
"source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204208.15114: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204208.15118: _low_level_execute_command(): starting 41445 1727204208.15120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204206.5027227-43081-85151365751309/ > /dev/null 2>&1 && sleep 0' 41445 1727204208.15660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204208.15682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204208.15698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.15718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204208.15734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204208.15790: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.15844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.15864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204208.15919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.15943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.17724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204208.17761: stderr chunk (state=3): >>><<< 41445 1727204208.17765: stdout chunk (state=3): >>><<< 41445 1727204208.17783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204208.17793: handler run complete 41445 1727204208.17953: variable 'ansible_facts' from source: unknown 41445 1727204208.18186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.18559: variable 'ansible_facts' from source: unknown 41445 1727204208.18660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.18969: attempt loop complete, returning result 41445 1727204208.18973: _execute() done 41445 1727204208.18977: dumping result to json 41445 1727204208.18979: done dumping result, returning 41445 1727204208.18982: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-bf02-eee4-0000000006e8] 41445 1727204208.18984: sending task result for task 028d2410-947f-bf02-eee4-0000000006e8 41445 1727204208.19837: done sending task result for task 028d2410-947f-bf02-eee4-0000000006e8 41445 1727204208.19841: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204208.19941: no more pending results, returning what we have 41445 1727204208.19944: results queue empty 41445 1727204208.19945: checking for any_errors_fatal 41445 1727204208.19947: done checking for any_errors_fatal 41445 1727204208.19948: checking for max_fail_percentage 41445 1727204208.19949: done checking for max_fail_percentage 41445 1727204208.19950: checking to see if all hosts have failed and the running result is not ok 41445 1727204208.19951: done checking to see if all hosts have failed 41445 1727204208.19951: getting the remaining hosts for this loop 41445 1727204208.19953: done getting the remaining 
hosts for this loop 41445 1727204208.19956: getting the next task for host managed-node3 41445 1727204208.19961: done getting next task for host managed-node3 41445 1727204208.19964: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41445 1727204208.19966: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204208.19976: getting variables 41445 1727204208.19978: in VariableManager get_vars() 41445 1727204208.20006: Calling all_inventory to load vars for managed-node3 41445 1727204208.20009: Calling groups_inventory to load vars for managed-node3 41445 1727204208.20011: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204208.20019: Calling all_plugins_play to load vars for managed-node3 41445 1727204208.20022: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204208.20026: Calling groups_plugins_play to load vars for managed-node3 41445 1727204208.20778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.21862: done with get_vars() 41445 1727204208.21925: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:48 -0400 (0:00:01.766) 0:00:27.008 ***** 41445 1727204208.22039: entering 
_queue_task() for managed-node3/package_facts 41445 1727204208.22477: worker is 1 (out of 1 available) 41445 1727204208.22491: exiting _queue_task() for managed-node3/package_facts 41445 1727204208.22509: done queuing things up, now waiting for results queue to drain 41445 1727204208.22510: waiting for pending results... 41445 1727204208.22705: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41445 1727204208.22784: in run() - task 028d2410-947f-bf02-eee4-0000000006e9 41445 1727204208.22798: variable 'ansible_search_path' from source: unknown 41445 1727204208.22802: variable 'ansible_search_path' from source: unknown 41445 1727204208.22832: calling self._execute() 41445 1727204208.22909: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.22916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.22925: variable 'omit' from source: magic vars 41445 1727204208.23207: variable 'ansible_distribution_major_version' from source: facts 41445 1727204208.23221: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204208.23226: variable 'omit' from source: magic vars 41445 1727204208.23261: variable 'omit' from source: magic vars 41445 1727204208.23287: variable 'omit' from source: magic vars 41445 1727204208.23324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204208.23350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204208.23366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204208.23381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204208.23392: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204208.23419: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204208.23423: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.23425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.23494: Set connection var ansible_shell_executable to /bin/sh 41445 1727204208.23498: Set connection var ansible_shell_type to sh 41445 1727204208.23500: Set connection var ansible_pipelining to False 41445 1727204208.23509: Set connection var ansible_timeout to 10 41445 1727204208.23512: Set connection var ansible_connection to ssh 41445 1727204208.23521: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204208.23540: variable 'ansible_shell_executable' from source: unknown 41445 1727204208.23543: variable 'ansible_connection' from source: unknown 41445 1727204208.23546: variable 'ansible_module_compression' from source: unknown 41445 1727204208.23548: variable 'ansible_shell_type' from source: unknown 41445 1727204208.23551: variable 'ansible_shell_executable' from source: unknown 41445 1727204208.23554: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.23556: variable 'ansible_pipelining' from source: unknown 41445 1727204208.23558: variable 'ansible_timeout' from source: unknown 41445 1727204208.23560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.23709: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204208.23721: variable 'omit' from source: magic vars 41445 1727204208.23724: starting attempt loop 41445 1727204208.23726: running 
the handler 41445 1727204208.23740: _low_level_execute_command(): starting 41445 1727204208.23749: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204208.24257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.24261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204208.24264: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204208.24267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.24307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.24323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.24363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.25984: stdout chunk (state=3): >>>/root <<< 41445 1727204208.26080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204208.26118: stderr chunk (state=3): >>><<< 41445 1727204208.26122: stdout chunk (state=3): >>><<< 41445 1727204208.26143: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204208.26155: _low_level_execute_command(): starting 41445 1727204208.26162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887 `" && echo ansible-tmp-1727204208.2614331-43145-247307403851887="` echo /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887 `" ) && sleep 0' 41445 1727204208.26615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.26619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.26635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.26693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.26696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204208.26700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.26735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.28577: stdout chunk (state=3): >>>ansible-tmp-1727204208.2614331-43145-247307403851887=/root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887 <<< 41445 1727204208.28682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204208.28722: stderr chunk (state=3): >>><<< 41445 1727204208.28725: stdout chunk (state=3): >>><<< 41445 1727204208.28739: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204208.2614331-43145-247307403851887=/root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204208.28781: variable 'ansible_module_compression' from source: unknown 41445 1727204208.28824: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41445 1727204208.28877: variable 'ansible_facts' from source: unknown 41445 1727204208.28995: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py 41445 1727204208.29104: Sending initial data 41445 1727204208.29107: Sent initial data (162 bytes) 41445 1727204208.29566: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.29569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.29572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204208.29574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.29628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.29631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.29669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.31156: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41445 1727204208.31169: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204208.31188: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 41445 1727204208.31224: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpvfviza_m /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py <<< 41445 1727204208.31230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py" <<< 41445 1727204208.31260: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpvfviza_m" to remote "/root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py" <<< 41445 1727204208.31262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py" <<< 41445 1727204208.32255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204208.32302: stderr chunk (state=3): >>><<< 41445 1727204208.32306: stdout chunk (state=3): >>><<< 41445 1727204208.32343: done transferring module to remote 41445 1727204208.32352: _low_level_execute_command(): starting 41445 1727204208.32357: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/ /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py && sleep 0' 41445 1727204208.32805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.32808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204208.32813: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.32815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.32817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.32869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.32872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204208.32887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.32907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.34606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204208.34634: stderr chunk (state=3): >>><<< 41445 1727204208.34637: stdout chunk (state=3): >>><<< 41445 1727204208.34651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204208.34654: _low_level_execute_command(): starting 41445 1727204208.34663: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/AnsiballZ_package_facts.py && sleep 0' 41445 1727204208.35092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.35098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.35117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.35168: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.35177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204208.35180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.35216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.79269: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", 
"version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 41445 1727204208.79285: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": 
"dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 41445 1727204208.79314: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", 
"version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 41445 1727204208.79355: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": 
"file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": 
[{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": 
"2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": 
"1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", 
"version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": 
[{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": 
"perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": 
"perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": 
"510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", 
"version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41445 1727204208.81214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204208.81248: stderr chunk (state=3): >>><<< 41445 1727204208.81251: stdout chunk (state=3): >>><<< 41445 1727204208.81295: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204208.82494: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204208.82513: _low_level_execute_command(): starting 41445 1727204208.82516: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204208.2614331-43145-247307403851887/ > /dev/null 2>&1 && sleep 0' 41445 1727204208.82955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.82962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204208.82986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.83001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204208.83003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204208.83052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204208.83055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204208.83057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204208.83100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204208.84895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204208.84925: stderr chunk (state=3): >>><<< 41445 1727204208.84929: stdout chunk (state=3): >>><<< 41445 1727204208.84942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204208.84948: handler run complete 41445 1727204208.85490: variable 'ansible_facts' from source: unknown 41445 1727204208.85754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.86832: variable 'ansible_facts' from source: unknown 41445 1727204208.87069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.87452: attempt loop complete, returning result 41445 1727204208.87461: _execute() done 41445 1727204208.87464: dumping result to json 41445 1727204208.87581: done dumping result, returning 41445 1727204208.87589: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-bf02-eee4-0000000006e9] 41445 1727204208.87594: sending task result for task 028d2410-947f-bf02-eee4-0000000006e9 41445 1727204208.89216: done sending task result for task 028d2410-947f-bf02-eee4-0000000006e9 41445 1727204208.89219: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204208.89371: no more pending results, returning what we have 41445 1727204208.89374: results queue empty 41445 1727204208.89377: checking for any_errors_fatal 41445 1727204208.89381: done checking for any_errors_fatal 41445 1727204208.89382: checking for max_fail_percentage 41445 1727204208.89384: done checking for max_fail_percentage 41445 1727204208.89385: checking to see if all hosts have failed and the running result is not ok 41445 1727204208.89386: done checking to see if all hosts have failed 41445 1727204208.89386: getting the remaining hosts for this loop 41445 1727204208.89388: done getting the remaining hosts for this loop 41445 1727204208.89391: getting 
the next task for host managed-node3 41445 1727204208.89398: done getting next task for host managed-node3 41445 1727204208.89401: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41445 1727204208.89403: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204208.89412: getting variables 41445 1727204208.89413: in VariableManager get_vars() 41445 1727204208.89468: Calling all_inventory to load vars for managed-node3 41445 1727204208.89472: Calling groups_inventory to load vars for managed-node3 41445 1727204208.89474: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204208.89498: Calling all_plugins_play to load vars for managed-node3 41445 1727204208.89508: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204208.89512: Calling groups_plugins_play to load vars for managed-node3 41445 1727204208.90355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.91236: done with get_vars() 41445 1727204208.91252: done getting variables 41445 1727204208.91298: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:48 -0400 (0:00:00.692) 0:00:27.700 ***** 41445 1727204208.91322: entering _queue_task() for managed-node3/debug 
41445 1727204208.91569: worker is 1 (out of 1 available) 41445 1727204208.91587: exiting _queue_task() for managed-node3/debug 41445 1727204208.91600: done queuing things up, now waiting for results queue to drain 41445 1727204208.91601: waiting for pending results... 41445 1727204208.91995: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 41445 1727204208.92186: in run() - task 028d2410-947f-bf02-eee4-0000000000b8 41445 1727204208.92190: variable 'ansible_search_path' from source: unknown 41445 1727204208.92193: variable 'ansible_search_path' from source: unknown 41445 1727204208.92196: calling self._execute() 41445 1727204208.92199: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.92202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.92205: variable 'omit' from source: magic vars 41445 1727204208.92562: variable 'ansible_distribution_major_version' from source: facts 41445 1727204208.92573: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204208.92580: variable 'omit' from source: magic vars 41445 1727204208.92616: variable 'omit' from source: magic vars 41445 1727204208.92832: variable 'network_provider' from source: set_fact 41445 1727204208.92843: variable 'omit' from source: magic vars 41445 1727204208.92846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204208.92849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204208.92851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204208.92854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204208.92857: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204208.92952: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204208.92955: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.92958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.92981: Set connection var ansible_shell_executable to /bin/sh 41445 1727204208.92985: Set connection var ansible_shell_type to sh 41445 1727204208.92991: Set connection var ansible_pipelining to False 41445 1727204208.93001: Set connection var ansible_timeout to 10 41445 1727204208.93004: Set connection var ansible_connection to ssh 41445 1727204208.93010: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204208.93040: variable 'ansible_shell_executable' from source: unknown 41445 1727204208.93043: variable 'ansible_connection' from source: unknown 41445 1727204208.93046: variable 'ansible_module_compression' from source: unknown 41445 1727204208.93049: variable 'ansible_shell_type' from source: unknown 41445 1727204208.93051: variable 'ansible_shell_executable' from source: unknown 41445 1727204208.93080: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.93084: variable 'ansible_pipelining' from source: unknown 41445 1727204208.93087: variable 'ansible_timeout' from source: unknown 41445 1727204208.93089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.93197: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204208.93279: variable 'omit' from source: magic vars 41445 1727204208.93282: starting attempt 
loop 41445 1727204208.93285: running the handler 41445 1727204208.93288: handler run complete 41445 1727204208.93290: attempt loop complete, returning result 41445 1727204208.93298: _execute() done 41445 1727204208.93300: dumping result to json 41445 1727204208.93302: done dumping result, returning 41445 1727204208.93304: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-bf02-eee4-0000000000b8] 41445 1727204208.93306: sending task result for task 028d2410-947f-bf02-eee4-0000000000b8 41445 1727204208.93481: done sending task result for task 028d2410-947f-bf02-eee4-0000000000b8 41445 1727204208.93489: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 41445 1727204208.93543: no more pending results, returning what we have 41445 1727204208.93546: results queue empty 41445 1727204208.93546: checking for any_errors_fatal 41445 1727204208.93553: done checking for any_errors_fatal 41445 1727204208.93554: checking for max_fail_percentage 41445 1727204208.93555: done checking for max_fail_percentage 41445 1727204208.93556: checking to see if all hosts have failed and the running result is not ok 41445 1727204208.93556: done checking to see if all hosts have failed 41445 1727204208.93557: getting the remaining hosts for this loop 41445 1727204208.93558: done getting the remaining hosts for this loop 41445 1727204208.93561: getting the next task for host managed-node3 41445 1727204208.93566: done getting next task for host managed-node3 41445 1727204208.93569: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41445 1727204208.93571: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41445 1727204208.93582: getting variables 41445 1727204208.93583: in VariableManager get_vars() 41445 1727204208.93619: Calling all_inventory to load vars for managed-node3 41445 1727204208.93622: Calling groups_inventory to load vars for managed-node3 41445 1727204208.93624: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204208.93632: Calling all_plugins_play to load vars for managed-node3 41445 1727204208.93634: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204208.93637: Calling groups_plugins_play to load vars for managed-node3 41445 1727204208.95069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204208.96861: done with get_vars() 41445 1727204208.96889: done getting variables 41445 1727204208.96952: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:48 -0400 (0:00:00.056) 0:00:27.757 ***** 41445 1727204208.96986: entering _queue_task() for managed-node3/fail 41445 1727204208.97325: worker is 1 (out of 1 available) 41445 1727204208.97337: exiting _queue_task() for managed-node3/fail 41445 1727204208.97349: done queuing things up, now waiting for results queue to drain 41445 1727204208.97350: waiting for pending results... 
41445 1727204208.97644: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41445 1727204208.97770: in run() - task 028d2410-947f-bf02-eee4-0000000000b9 41445 1727204208.97803: variable 'ansible_search_path' from source: unknown 41445 1727204208.97816: variable 'ansible_search_path' from source: unknown 41445 1727204208.97857: calling self._execute() 41445 1727204208.97961: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204208.98020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204208.98024: variable 'omit' from source: magic vars 41445 1727204208.98373: variable 'ansible_distribution_major_version' from source: facts 41445 1727204208.98443: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204208.98577: variable 'network_state' from source: role '' defaults 41445 1727204208.98593: Evaluated conditional (network_state != {}): False 41445 1727204208.98600: when evaluation is False, skipping this task 41445 1727204208.98606: _execute() done 41445 1727204208.98613: dumping result to json 41445 1727204208.98673: done dumping result, returning 41445 1727204208.98679: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-bf02-eee4-0000000000b9] 41445 1727204208.98682: sending task result for task 028d2410-947f-bf02-eee4-0000000000b9 41445 1727204208.98746: done sending task result for task 028d2410-947f-bf02-eee4-0000000000b9 41445 1727204208.98748: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204208.98828: no more pending results, 
returning what we have 41445 1727204208.98832: results queue empty 41445 1727204208.98833: checking for any_errors_fatal 41445 1727204208.98842: done checking for any_errors_fatal 41445 1727204208.98843: checking for max_fail_percentage 41445 1727204208.98844: done checking for max_fail_percentage 41445 1727204208.98845: checking to see if all hosts have failed and the running result is not ok 41445 1727204208.98846: done checking to see if all hosts have failed 41445 1727204208.98847: getting the remaining hosts for this loop 41445 1727204208.98849: done getting the remaining hosts for this loop 41445 1727204208.98853: getting the next task for host managed-node3 41445 1727204208.98859: done getting next task for host managed-node3 41445 1727204208.98863: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41445 1727204208.98866: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204208.98883: getting variables 41445 1727204208.98885: in VariableManager get_vars() 41445 1727204208.98922: Calling all_inventory to load vars for managed-node3 41445 1727204208.98925: Calling groups_inventory to load vars for managed-node3 41445 1727204208.98928: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204208.98939: Calling all_plugins_play to load vars for managed-node3 41445 1727204208.98942: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204208.98945: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.00540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.13239: done with get_vars() 41445 1727204209.13267: done getting variables 41445 1727204209.13320: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.163) 0:00:27.921 ***** 41445 1727204209.13347: entering _queue_task() for managed-node3/fail 41445 1727204209.14108: worker is 1 (out of 1 available) 41445 1727204209.14116: exiting _queue_task() for managed-node3/fail 41445 1727204209.14126: done queuing things up, now waiting for results queue to drain 41445 1727204209.14127: waiting for pending results... 
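Both "Abort applying the network state configuration" tasks above are skipped because the role-default `network_state` is an empty dict, so the `false_condition` reported in the skip result is `network_state != {}`. A minimal sketch of that guard pattern (task fields other than the quoted condition are hypothetical reconstructions, not the role's actual source):

```yaml
# Hypothetical sketch of the guard seen at tasks/main.yml:11 and :18.
# Only the `network_state != {}` condition is confirmed by the log;
# the real tasks presumably add a provider / version check as a
# second `when` entry before the fail fires.
- name: Abort applying the network state configuration if using the
    network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with this configuration
  when:
    - network_state != {}
```

Because `when` evaluates to `False` here, the action plugin never executes and the task result carries `"skip_reason": "Conditional result was False"`, exactly as shown in the log output.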
41445 1727204209.14258: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41445 1727204209.14303: in run() - task 028d2410-947f-bf02-eee4-0000000000ba 41445 1727204209.14452: variable 'ansible_search_path' from source: unknown 41445 1727204209.14461: variable 'ansible_search_path' from source: unknown 41445 1727204209.14505: calling self._execute() 41445 1727204209.14621: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.14662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.14666: variable 'omit' from source: magic vars 41445 1727204209.15037: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.15053: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.15193: variable 'network_state' from source: role '' defaults 41445 1727204209.15316: Evaluated conditional (network_state != {}): False 41445 1727204209.15320: when evaluation is False, skipping this task 41445 1727204209.15322: _execute() done 41445 1727204209.15325: dumping result to json 41445 1727204209.15328: done dumping result, returning 41445 1727204209.15331: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-bf02-eee4-0000000000ba] 41445 1727204209.15334: sending task result for task 028d2410-947f-bf02-eee4-0000000000ba 41445 1727204209.15403: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ba 41445 1727204209.15406: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204209.15468: no more pending results, returning what we have 41445 
1727204209.15473: results queue empty 41445 1727204209.15474: checking for any_errors_fatal 41445 1727204209.15489: done checking for any_errors_fatal 41445 1727204209.15490: checking for max_fail_percentage 41445 1727204209.15492: done checking for max_fail_percentage 41445 1727204209.15493: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.15494: done checking to see if all hosts have failed 41445 1727204209.15494: getting the remaining hosts for this loop 41445 1727204209.15496: done getting the remaining hosts for this loop 41445 1727204209.15500: getting the next task for host managed-node3 41445 1727204209.15507: done getting next task for host managed-node3 41445 1727204209.15511: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41445 1727204209.15513: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.15528: getting variables 41445 1727204209.15530: in VariableManager get_vars() 41445 1727204209.15569: Calling all_inventory to load vars for managed-node3 41445 1727204209.15572: Calling groups_inventory to load vars for managed-node3 41445 1727204209.15575: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.15692: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.15696: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.15699: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.17134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.20311: done with get_vars() 41445 1727204209.20339: done getting variables 41445 1727204209.20401: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.070) 0:00:27.992 ***** 41445 1727204209.20432: entering _queue_task() for managed-node3/fail 41445 1727204209.21401: worker is 1 (out of 1 available) 41445 1727204209.21410: exiting _queue_task() for managed-node3/fail 41445 1727204209.21419: done queuing things up, now waiting for results queue to drain 41445 1727204209.21420: waiting for pending results... 
41445 1727204209.21487: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41445 1727204209.21646: in run() - task 028d2410-947f-bf02-eee4-0000000000bb 41445 1727204209.21650: variable 'ansible_search_path' from source: unknown 41445 1727204209.21653: variable 'ansible_search_path' from source: unknown 41445 1727204209.21669: calling self._execute() 41445 1727204209.21778: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.21863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.21866: variable 'omit' from source: magic vars 41445 1727204209.22179: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.22201: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.22470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204209.25323: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.25449: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.25784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.25787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.25790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.25875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.25932: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.26037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.26091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.26133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.26238: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.26258: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41445 1727204209.26390: variable 'ansible_distribution' from source: facts 41445 1727204209.26399: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.26412: Evaluated conditional (ansible_distribution in __network_rh_distros): True 41445 1727204209.26680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.26709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.26737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 
1727204209.26788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.26806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.26854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.26887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.26915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.26957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.26982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.27090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.27094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 41445 1727204209.27097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.27121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.27138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.27453: variable 'network_connections' from source: play vars 41445 1727204209.27469: variable 'profile' from source: play vars 41445 1727204209.27548: variable 'profile' from source: play vars 41445 1727204209.27558: variable 'interface' from source: set_fact 41445 1727204209.27614: variable 'interface' from source: set_fact 41445 1727204209.27631: variable 'network_state' from source: role '' defaults 41445 1727204209.27690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204209.28268: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204209.28318: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204209.28359: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204209.28545: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204209.28706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204209.28805: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204209.28808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.28811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204209.28834: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 41445 1727204209.28841: when evaluation is False, skipping this task 41445 1727204209.28849: _execute() done 41445 1727204209.28919: dumping result to json 41445 1727204209.28928: done dumping result, returning 41445 1727204209.28987: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-bf02-eee4-0000000000bb] 41445 1727204209.28990: sending task result for task 028d2410-947f-bf02-eee4-0000000000bb 41445 1727204209.29062: done sending task result for task 028d2410-947f-bf02-eee4-0000000000bb skipping: [managed-node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 41445 1727204209.29115: no more pending results, 
returning what we have 41445 1727204209.29119: results queue empty 41445 1727204209.29120: checking for any_errors_fatal 41445 1727204209.29125: done checking for any_errors_fatal 41445 1727204209.29126: checking for max_fail_percentage 41445 1727204209.29128: done checking for max_fail_percentage 41445 1727204209.29129: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.29130: done checking to see if all hosts have failed 41445 1727204209.29131: getting the remaining hosts for this loop 41445 1727204209.29132: done getting the remaining hosts for this loop 41445 1727204209.29136: getting the next task for host managed-node3 41445 1727204209.29143: done getting next task for host managed-node3 41445 1727204209.29148: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41445 1727204209.29150: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.29165: getting variables 41445 1727204209.29167: in VariableManager get_vars() 41445 1727204209.29212: Calling all_inventory to load vars for managed-node3 41445 1727204209.29216: Calling groups_inventory to load vars for managed-node3 41445 1727204209.29218: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.29231: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.29234: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.29237: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.30383: WORKER PROCESS EXITING 41445 1727204209.32066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.33654: done with get_vars() 41445 1727204209.33705: done getting variables 41445 1727204209.33769: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.133) 0:00:28.125 ***** 41445 1727204209.33802: entering _queue_task() for managed-node3/dnf 41445 1727204209.34136: worker is 1 (out of 1 available) 41445 1727204209.34149: exiting _queue_task() for managed-node3/dnf 41445 1727204209.34161: done queuing things up, now waiting for results queue to drain 41445 1727204209.34162: waiting for pending results... 
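The EL10 teaming guard above is skipped because no `team`-type connections are defined in either `network_connections` or `network_state`. The condition is quoted verbatim in the skip result; a sketch of it as a `when` clause (the surrounding task wrapper is a hypothetical reconstruction):

```yaml
# The multi-line condition below is copied from the false_condition in
# the skip result above; the name/fail/msg fields are assumptions.
- name: Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this distribution version
  when: >-
    network_connections | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", []) | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
```

The `selectattr("type", "defined")` step filters out entries with no `type` key before the regex match, so connections that omit `type` cannot raise an error in the `match` test.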
41445 1727204209.34452: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41445 1727204209.34575: in run() - task 028d2410-947f-bf02-eee4-0000000000bc 41445 1727204209.34599: variable 'ansible_search_path' from source: unknown 41445 1727204209.34608: variable 'ansible_search_path' from source: unknown 41445 1727204209.34646: calling self._execute() 41445 1727204209.34758: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.34768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.34786: variable 'omit' from source: magic vars 41445 1727204209.35168: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.35190: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.35398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204209.38357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.38465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.38471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.38512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.38544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.38637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.38673: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.38710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.38791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.38795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.38899: variable 'ansible_distribution' from source: facts 41445 1727204209.38909: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.38928: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41445 1727204209.39048: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.39225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.39229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.39244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.39288: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.39306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.39355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.39384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.39412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.39552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.39555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.39558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.39560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 
1727204209.39572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.39616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.39634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.39800: variable 'network_connections' from source: play vars 41445 1727204209.39816: variable 'profile' from source: play vars 41445 1727204209.39893: variable 'profile' from source: play vars 41445 1727204209.39902: variable 'interface' from source: set_fact 41445 1727204209.39962: variable 'interface' from source: set_fact 41445 1727204209.40038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204209.40225: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204209.40266: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204209.40304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204209.40342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204209.40484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204209.40487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204209.40506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.40539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204209.40588: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204209.41073: variable 'network_connections' from source: play vars 41445 1727204209.41079: variable 'profile' from source: play vars 41445 1727204209.41281: variable 'profile' from source: play vars 41445 1727204209.41284: variable 'interface' from source: set_fact 41445 1727204209.41288: variable 'interface' from source: set_fact 41445 1727204209.41290: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204209.41292: when evaluation is False, skipping this task 41445 1727204209.41294: _execute() done 41445 1727204209.41408: dumping result to json 41445 1727204209.41411: done dumping result, returning 41445 1727204209.41413: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000bc] 41445 1727204209.41416: sending task result for task 028d2410-947f-bf02-eee4-0000000000bc skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204209.41720: no more pending results, returning what we have 41445 1727204209.41724: results queue 
empty 41445 1727204209.41725: checking for any_errors_fatal 41445 1727204209.41733: done checking for any_errors_fatal 41445 1727204209.41734: checking for max_fail_percentage 41445 1727204209.41735: done checking for max_fail_percentage 41445 1727204209.41736: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.41737: done checking to see if all hosts have failed 41445 1727204209.41738: getting the remaining hosts for this loop 41445 1727204209.41739: done getting the remaining hosts for this loop 41445 1727204209.41743: getting the next task for host managed-node3 41445 1727204209.41751: done getting next task for host managed-node3 41445 1727204209.41754: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41445 1727204209.41757: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.41778: getting variables 41445 1727204209.41780: in VariableManager get_vars() 41445 1727204209.41821: Calling all_inventory to load vars for managed-node3 41445 1727204209.41823: Calling groups_inventory to load vars for managed-node3 41445 1727204209.41825: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.41835: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.41837: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.41840: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.42402: done sending task result for task 028d2410-947f-bf02-eee4-0000000000bc 41445 1727204209.42406: WORKER PROCESS EXITING 41445 1727204209.43460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.45107: done with get_vars() 41445 1727204209.45131: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41445 1727204209.45206: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.114) 0:00:28.240 ***** 41445 1727204209.45234: entering _queue_task() for managed-node3/yum 41445 1727204209.45562: worker is 1 (out of 1 available) 41445 1727204209.45580: exiting _queue_task() for managed-node3/yum 41445 1727204209.45594: done queuing things up, now 
waiting for results queue to drain 41445 1727204209.45596: waiting for pending results... 41445 1727204209.45904: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41445 1727204209.45966: in run() - task 028d2410-947f-bf02-eee4-0000000000bd 41445 1727204209.45989: variable 'ansible_search_path' from source: unknown 41445 1727204209.46002: variable 'ansible_search_path' from source: unknown 41445 1727204209.46048: calling self._execute() 41445 1727204209.46162: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.46173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.46215: variable 'omit' from source: magic vars 41445 1727204209.46592: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.46610: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.46803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204209.49251: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.49298: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.49336: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.49364: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.49386: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.49444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.49465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.49487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.49515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.49526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.49593: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.49606: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41445 1727204209.49609: when evaluation is False, skipping this task 41445 1727204209.49614: _execute() done 41445 1727204209.49617: dumping result to json 41445 1727204209.49619: done dumping result, returning 41445 1727204209.49624: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000bd] 41445 1727204209.49629: sending task result for task 028d2410-947f-bf02-eee4-0000000000bd 41445 1727204209.49717: done sending task result for task 028d2410-947f-bf02-eee4-0000000000bd 41445 1727204209.49720: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41445 1727204209.49771: no more pending results, returning what we have 41445 1727204209.49774: results queue empty 41445 1727204209.49777: checking for any_errors_fatal 41445 1727204209.49782: done checking for any_errors_fatal 41445 1727204209.49783: checking for max_fail_percentage 41445 1727204209.49784: done checking for max_fail_percentage 41445 1727204209.49785: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.49786: done checking to see if all hosts have failed 41445 1727204209.49786: getting the remaining hosts for this loop 41445 1727204209.49788: done getting the remaining hosts for this loop 41445 1727204209.49791: getting the next task for host managed-node3 41445 1727204209.49798: done getting next task for host managed-node3 41445 1727204209.49801: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41445 1727204209.49803: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.49818: getting variables 41445 1727204209.49820: in VariableManager get_vars() 41445 1727204209.49860: Calling all_inventory to load vars for managed-node3 41445 1727204209.49863: Calling groups_inventory to load vars for managed-node3 41445 1727204209.49865: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.49874: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.49879: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.49881: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.51011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.52195: done with get_vars() 41445 1727204209.52214: done getting variables 41445 1727204209.52256: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.070) 0:00:28.310 ***** 41445 1727204209.52279: entering _queue_task() for managed-node3/fail 41445 1727204209.52517: worker is 1 (out of 1 available) 41445 1727204209.52530: exiting _queue_task() for managed-node3/fail 41445 1727204209.52541: done queuing things up, now waiting for results queue to drain 41445 1727204209.52543: waiting for pending results... 
41445 1727204209.52723: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41445 1727204209.52800: in run() - task 028d2410-947f-bf02-eee4-0000000000be 41445 1727204209.52811: variable 'ansible_search_path' from source: unknown 41445 1727204209.52818: variable 'ansible_search_path' from source: unknown 41445 1727204209.52846: calling self._execute() 41445 1727204209.52930: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.52934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.52942: variable 'omit' from source: magic vars 41445 1727204209.53231: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.53240: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.53325: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.53458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204209.55494: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.55546: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.55574: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.55607: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.55627: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.55685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204209.55712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.55729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.55754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.55765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.55803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.55821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.55838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.55862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.55873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.55904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.55924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.55941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.55964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.55977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.56089: variable 'network_connections' from source: play vars 41445 1727204209.56099: variable 'profile' from source: play vars 41445 1727204209.56151: variable 'profile' from source: play vars 41445 1727204209.56155: variable 'interface' from source: set_fact 41445 1727204209.56201: variable 'interface' from source: set_fact 41445 1727204209.56252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204209.56371: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204209.56399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204209.56422: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204209.56444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204209.56478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204209.56494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204209.56514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.56530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204209.56573: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204209.56753: variable 'network_connections' from source: play vars 41445 1727204209.56757: variable 'profile' from source: play vars 41445 1727204209.56841: variable 'profile' from source: play vars 41445 1727204209.56844: variable 'interface' from source: set_fact 41445 1727204209.56911: variable 'interface' from source: set_fact 41445 1727204209.56915: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204209.56922: when evaluation is False, skipping this task 41445 1727204209.56925: _execute() done 41445 1727204209.56927: dumping result to json 41445 1727204209.56929: done dumping result, returning 41445 1727204209.56931: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000be] 41445 1727204209.56941: sending task result for task 028d2410-947f-bf02-eee4-0000000000be 41445 1727204209.57014: done sending task result for task 028d2410-947f-bf02-eee4-0000000000be 41445 1727204209.57018: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204209.57229: no more pending results, returning what we have 41445 1727204209.57232: results queue empty 41445 1727204209.57233: checking for any_errors_fatal 41445 1727204209.57238: done checking for any_errors_fatal 41445 1727204209.57239: checking for max_fail_percentage 41445 1727204209.57240: done checking for max_fail_percentage 41445 1727204209.57241: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.57242: done checking to see if all hosts have failed 41445 1727204209.57242: getting the remaining hosts for this loop 41445 1727204209.57243: done getting the remaining hosts for this loop 41445 1727204209.57247: getting the next task for host managed-node3 41445 1727204209.57251: done getting next task for host managed-node3 41445 1727204209.57255: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41445 1727204209.57256: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.57269: getting variables 41445 1727204209.57270: in VariableManager get_vars() 41445 1727204209.57315: Calling all_inventory to load vars for managed-node3 41445 1727204209.57317: Calling groups_inventory to load vars for managed-node3 41445 1727204209.57320: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.57328: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.57330: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.57333: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.58584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.59666: done with get_vars() 41445 1727204209.59685: done getting variables 41445 1727204209.59732: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.074) 0:00:28.385 ***** 41445 1727204209.59755: entering _queue_task() for managed-node3/package 41445 1727204209.59993: worker is 1 (out of 1 available) 41445 1727204209.60008: exiting _queue_task() for managed-node3/package 41445 1727204209.60022: done queuing things up, now waiting for results queue to drain 41445 1727204209.60023: waiting for pending results... 
41445 1727204209.60194: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41445 1727204209.60267: in run() - task 028d2410-947f-bf02-eee4-0000000000bf 41445 1727204209.60279: variable 'ansible_search_path' from source: unknown 41445 1727204209.60283: variable 'ansible_search_path' from source: unknown 41445 1727204209.60314: calling self._execute() 41445 1727204209.60393: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.60397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.60405: variable 'omit' from source: magic vars 41445 1727204209.60670: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.60682: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.60871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204209.61381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204209.61384: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204209.61387: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204209.61565: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204209.61680: variable 'network_packages' from source: role '' defaults 41445 1727204209.61787: variable '__network_provider_setup' from source: role '' defaults 41445 1727204209.61792: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204209.61856: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204209.61864: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204209.61912: variable 
'__network_packages_default_nm' from source: role '' defaults 41445 1727204209.62027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204209.63326: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.63371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.63399: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.63422: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.63441: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.63501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.63531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.63548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.63579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.63590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 
1727204209.63622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.63638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.63653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.63684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.63695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.63832: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204209.63905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.63924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.63941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.63965: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.63976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.64038: variable 'ansible_python' from source: facts 41445 1727204209.64058: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204209.64117: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204209.64169: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204209.64255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.64272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.64290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.64321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.64329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.64360: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.64381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.64397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.64423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.64437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.64528: variable 'network_connections' from source: play vars 41445 1727204209.64535: variable 'profile' from source: play vars 41445 1727204209.64605: variable 'profile' from source: play vars 41445 1727204209.64614: variable 'interface' from source: set_fact 41445 1727204209.64661: variable 'interface' from source: set_fact 41445 1727204209.64714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204209.64732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204209.64752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.64778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204209.64816: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.64990: variable 'network_connections' from source: play vars 41445 1727204209.64994: variable 'profile' from source: play vars 41445 1727204209.65062: variable 'profile' from source: play vars 41445 1727204209.65068: variable 'interface' from source: set_fact 41445 1727204209.65121: variable 'interface' from source: set_fact 41445 1727204209.65145: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204209.65203: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.65391: variable 'network_connections' from source: play vars 41445 1727204209.65394: variable 'profile' from source: play vars 41445 1727204209.65442: variable 'profile' from source: play vars 41445 1727204209.65445: variable 'interface' from source: set_fact 41445 1727204209.65516: variable 'interface' from source: set_fact 41445 1727204209.65535: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204209.65590: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204209.65783: variable 'network_connections' from source: play vars 41445 1727204209.65786: variable 'profile' from source: play vars 41445 1727204209.65832: variable 'profile' from source: play vars 41445 1727204209.65836: variable 'interface' from source: set_fact 41445 1727204209.65907: variable 'interface' from source: set_fact 41445 1727204209.65944: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204209.65989: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204209.65995: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204209.66037: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204209.66171: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204209.66465: variable 'network_connections' from source: play vars 41445 1727204209.66468: variable 'profile' from source: play vars 41445 1727204209.66579: variable 'profile' from source: play vars 41445 1727204209.66582: variable 'interface' from source: set_fact 41445 1727204209.66583: variable 'interface' from source: set_fact 41445 1727204209.66584: variable 'ansible_distribution' from source: facts 41445 1727204209.66585: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.66587: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.66588: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204209.66696: variable 'ansible_distribution' from source: facts 41445 1727204209.66699: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.66705: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.66717: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204209.66822: variable 'ansible_distribution' from source: facts 41445 1727204209.66825: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.66828: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.66857: variable 'network_provider' from source: set_fact 41445 1727204209.66868: variable 'ansible_facts' from source: unknown 41445 1727204209.67238: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41445 
1727204209.67242: when evaluation is False, skipping this task 41445 1727204209.67244: _execute() done 41445 1727204209.67246: dumping result to json 41445 1727204209.67248: done dumping result, returning 41445 1727204209.67255: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-bf02-eee4-0000000000bf] 41445 1727204209.67261: sending task result for task 028d2410-947f-bf02-eee4-0000000000bf 41445 1727204209.67350: done sending task result for task 028d2410-947f-bf02-eee4-0000000000bf 41445 1727204209.67353: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41445 1727204209.67427: no more pending results, returning what we have 41445 1727204209.67430: results queue empty 41445 1727204209.67431: checking for any_errors_fatal 41445 1727204209.67436: done checking for any_errors_fatal 41445 1727204209.67437: checking for max_fail_percentage 41445 1727204209.67439: done checking for max_fail_percentage 41445 1727204209.67439: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.67440: done checking to see if all hosts have failed 41445 1727204209.67441: getting the remaining hosts for this loop 41445 1727204209.67442: done getting the remaining hosts for this loop 41445 1727204209.67446: getting the next task for host managed-node3 41445 1727204209.67452: done getting next task for host managed-node3 41445 1727204209.67456: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204209.67457: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41445 1727204209.67470: getting variables 41445 1727204209.67471: in VariableManager get_vars() 41445 1727204209.67514: Calling all_inventory to load vars for managed-node3 41445 1727204209.67517: Calling groups_inventory to load vars for managed-node3 41445 1727204209.67519: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.67533: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.67535: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.67537: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.68443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.69332: done with get_vars() 41445 1727204209.69347: done getting variables 41445 1727204209.69391: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.096) 0:00:28.481 ***** 41445 1727204209.69414: entering _queue_task() for managed-node3/package 41445 1727204209.69649: worker is 1 (out of 1 available) 41445 1727204209.69662: exiting _queue_task() for managed-node3/package 41445 1727204209.69678: done queuing things up, now waiting for results queue to drain 41445 1727204209.69679: waiting for pending results... 
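The "Install packages" task above is skipped because the Jinja conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluated False. A minimal Python sketch of that same subset check (package names and versions here are illustrative, not taken from this run):

```python
# Sketch of the subset test behind the skipped "Install packages" task.
# Contents are illustrative; real values come from role vars and gathered facts.
network_packages = ["NetworkManager"]
installed_packages = {
    "NetworkManager": [{"version": "1.48"}],
    "openssh-clients": [{"version": "9.6"}],
}

# Jinja: not network_packages is subset(ansible_facts.packages.keys())
needs_install = not set(network_packages).issubset(installed_packages.keys())
print(needs_install)  # False -> conditional fails, so the task is skipped
```

When every requested package already appears in `ansible_facts.packages`, the condition is False and the task is skipped exactly as the log records.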
41445 1727204209.69852: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204209.69928: in run() - task 028d2410-947f-bf02-eee4-0000000000c0 41445 1727204209.69939: variable 'ansible_search_path' from source: unknown 41445 1727204209.69943: variable 'ansible_search_path' from source: unknown 41445 1727204209.69969: calling self._execute() 41445 1727204209.70050: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.70054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.70063: variable 'omit' from source: magic vars 41445 1727204209.70335: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.70346: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.70426: variable 'network_state' from source: role '' defaults 41445 1727204209.70434: Evaluated conditional (network_state != {}): False 41445 1727204209.70438: when evaluation is False, skipping this task 41445 1727204209.70440: _execute() done 41445 1727204209.70442: dumping result to json 41445 1727204209.70447: done dumping result, returning 41445 1727204209.70458: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-bf02-eee4-0000000000c0] 41445 1727204209.70461: sending task result for task 028d2410-947f-bf02-eee4-0000000000c0 41445 1727204209.70544: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c0 41445 1727204209.70546: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204209.70606: no more pending results, returning what we have 41445 1727204209.70613: results queue empty 41445 1727204209.70614: checking 
for any_errors_fatal 41445 1727204209.70619: done checking for any_errors_fatal 41445 1727204209.70620: checking for max_fail_percentage 41445 1727204209.70621: done checking for max_fail_percentage 41445 1727204209.70623: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.70623: done checking to see if all hosts have failed 41445 1727204209.70624: getting the remaining hosts for this loop 41445 1727204209.70625: done getting the remaining hosts for this loop 41445 1727204209.70629: getting the next task for host managed-node3 41445 1727204209.70634: done getting next task for host managed-node3 41445 1727204209.70638: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204209.70640: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.70653: getting variables 41445 1727204209.70654: in VariableManager get_vars() 41445 1727204209.70687: Calling all_inventory to load vars for managed-node3 41445 1727204209.70689: Calling groups_inventory to load vars for managed-node3 41445 1727204209.70691: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.70699: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.70701: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.70704: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.71473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.72457: done with get_vars() 41445 1727204209.72472: done getting variables 41445 1727204209.72518: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.031) 0:00:28.513 ***** 41445 1727204209.72538: entering _queue_task() for managed-node3/package 41445 1727204209.72748: worker is 1 (out of 1 available) 41445 1727204209.72761: exiting _queue_task() for managed-node3/package 41445 1727204209.72773: done queuing things up, now waiting for results queue to drain 41445 1727204209.72774: waiting for pending results... 
41445 1727204209.72937: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204209.72998: in run() - task 028d2410-947f-bf02-eee4-0000000000c1 41445 1727204209.73014: variable 'ansible_search_path' from source: unknown 41445 1727204209.73018: variable 'ansible_search_path' from source: unknown 41445 1727204209.73040: calling self._execute() 41445 1727204209.73123: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.73127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.73130: variable 'omit' from source: magic vars 41445 1727204209.73392: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.73402: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.73484: variable 'network_state' from source: role '' defaults 41445 1727204209.73491: Evaluated conditional (network_state != {}): False 41445 1727204209.73495: when evaluation is False, skipping this task 41445 1727204209.73498: _execute() done 41445 1727204209.73500: dumping result to json 41445 1727204209.73503: done dumping result, returning 41445 1727204209.73513: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-bf02-eee4-0000000000c1] 41445 1727204209.73516: sending task result for task 028d2410-947f-bf02-eee4-0000000000c1 41445 1727204209.73600: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c1 41445 1727204209.73602: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204209.73652: no more pending results, returning what we have 41445 1727204209.73656: results queue empty 41445 1727204209.73657: checking for 
any_errors_fatal 41445 1727204209.73663: done checking for any_errors_fatal 41445 1727204209.73663: checking for max_fail_percentage 41445 1727204209.73665: done checking for max_fail_percentage 41445 1727204209.73665: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.73666: done checking to see if all hosts have failed 41445 1727204209.73667: getting the remaining hosts for this loop 41445 1727204209.73668: done getting the remaining hosts for this loop 41445 1727204209.73671: getting the next task for host managed-node3 41445 1727204209.73679: done getting next task for host managed-node3 41445 1727204209.73682: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204209.73683: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.73695: getting variables 41445 1727204209.73697: in VariableManager get_vars() 41445 1727204209.73730: Calling all_inventory to load vars for managed-node3 41445 1727204209.73732: Calling groups_inventory to load vars for managed-node3 41445 1727204209.73734: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.73743: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.73745: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.73747: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.74498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.75391: done with get_vars() 41445 1727204209.75409: done getting variables 41445 1727204209.75452: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.029) 0:00:28.542 ***** 41445 1727204209.75473: entering _queue_task() for managed-node3/service 41445 1727204209.75695: worker is 1 (out of 1 available) 41445 1727204209.75713: exiting _queue_task() for managed-node3/service 41445 1727204209.75724: done queuing things up, now waiting for results queue to drain 41445 1727204209.75725: waiting for pending results... 
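The "Restart NetworkManager due to wireless or team interfaces" task queued above is gated on `__network_wireless_connections_defined or __network_team_connections_defined`, which the log evaluates to False. A hedged sketch of how such flags could be derived from `network_connections` (the connection structure shown is illustrative):

```python
# Hedged sketch: deriving wireless/team flags from network_connections.
# The profile structure is illustrative, not the actual play vars of this run.
network_connections = [
    {"name": "ethtest0", "type": "ethernet", "interface_name": "ethtest0"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# Jinja: __network_wireless_connections_defined or __network_team_connections_defined
print(wireless_defined or team_defined)  # False -> restart task is skipped
```

With only an ethernet profile defined, both flags are False, so the restart is skipped as the log shows.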
41445 1727204209.75893: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204209.75968: in run() - task 028d2410-947f-bf02-eee4-0000000000c2 41445 1727204209.75982: variable 'ansible_search_path' from source: unknown 41445 1727204209.75986: variable 'ansible_search_path' from source: unknown 41445 1727204209.76015: calling self._execute() 41445 1727204209.76092: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.76096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.76105: variable 'omit' from source: magic vars 41445 1727204209.76368: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.76379: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.76460: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.76590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204209.78060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.78347: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.78374: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.78402: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.78422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.78681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41445 1727204209.78685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.78687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.78689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.78691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.78693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.78695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.78710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.78791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.78816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.78873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.78903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.78933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.78986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.79005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.79193: variable 'network_connections' from source: play vars 41445 1727204209.79215: variable 'profile' from source: play vars 41445 1727204209.79316: variable 'profile' from source: play vars 41445 1727204209.79337: variable 'interface' from source: set_fact 41445 1727204209.79405: variable 'interface' from source: set_fact 41445 1727204209.79474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204209.79609: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204209.79637: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204209.79659: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204209.79692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204209.79727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204209.79742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204209.79759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.79777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204209.79822: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204209.79966: variable 'network_connections' from source: play vars 41445 1727204209.79969: variable 'profile' from source: play vars 41445 1727204209.80015: variable 'profile' from source: play vars 41445 1727204209.80019: variable 'interface' from source: set_fact 41445 1727204209.80060: variable 'interface' from source: set_fact 41445 1727204209.80079: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204209.80082: when evaluation is False, skipping this task 41445 1727204209.80085: _execute() done 41445 1727204209.80087: dumping result to json 41445 1727204209.80089: done dumping result, returning 41445 1727204209.80097: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000c2] 41445 1727204209.80107: sending task result for task 028d2410-947f-bf02-eee4-0000000000c2 41445 1727204209.80188: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c2 41445 1727204209.80191: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204209.80236: no more pending results, returning what we have 41445 1727204209.80239: results queue empty 41445 1727204209.80240: checking for any_errors_fatal 41445 1727204209.80245: done checking for any_errors_fatal 41445 1727204209.80246: checking for max_fail_percentage 41445 1727204209.80247: done checking for max_fail_percentage 41445 1727204209.80248: checking to see if all hosts have failed and the running result is not ok 41445 1727204209.80249: done checking to see if all hosts have failed 41445 1727204209.80249: getting the remaining hosts for this loop 41445 1727204209.80251: done getting the remaining hosts for this loop 41445 1727204209.80254: getting the next task for host managed-node3 41445 1727204209.80260: done getting next task for host managed-node3 41445 1727204209.80264: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204209.80266: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204209.80281: getting variables 41445 1727204209.80282: in VariableManager get_vars() 41445 1727204209.80320: Calling all_inventory to load vars for managed-node3 41445 1727204209.80323: Calling groups_inventory to load vars for managed-node3 41445 1727204209.80325: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204209.80335: Calling all_plugins_play to load vars for managed-node3 41445 1727204209.80338: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204209.80341: Calling groups_plugins_play to load vars for managed-node3 41445 1727204209.81321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204209.82988: done with get_vars() 41445 1727204209.83015: done getting variables 41445 1727204209.83073: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:49 -0400 (0:00:00.076) 0:00:28.618 ***** 41445 1727204209.83107: entering _queue_task() for managed-node3/service 41445 1727204209.83440: worker is 1 (out of 1 available) 41445 1727204209.83452: exiting _queue_task() for managed-node3/service 41445 1727204209.83464: done queuing things up, now waiting for results queue to drain 41445 1727204209.83465: waiting for pending results... 
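The "Enable and start NetworkManager" task queued above is guarded by `network_provider == "nm" or network_state != {}`, which the run evaluates to True (so the service task proceeds rather than being skipped). A minimal sketch of that guard, with illustrative values:

```python
# Sketch of the guard on "Enable and start NetworkManager".
# Values are illustrative; network_provider comes from set_fact and
# network_state from the role defaults in the actual run.
network_provider = "nm"
network_state = {}

# Jinja: network_provider == "nm" or network_state != {}
run_task = (network_provider == "nm") or (network_state != {})
print(run_task)  # True -> the service task is executed
```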
41445 1727204209.83897: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204209.83903: in run() - task 028d2410-947f-bf02-eee4-0000000000c3 41445 1727204209.83906: variable 'ansible_search_path' from source: unknown 41445 1727204209.83908: variable 'ansible_search_path' from source: unknown 41445 1727204209.83924: calling self._execute() 41445 1727204209.84042: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.84053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.84068: variable 'omit' from source: magic vars 41445 1727204209.84452: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.84468: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204209.84640: variable 'network_provider' from source: set_fact 41445 1727204209.84654: variable 'network_state' from source: role '' defaults 41445 1727204209.84667: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41445 1727204209.84756: variable 'omit' from source: magic vars 41445 1727204209.84759: variable 'omit' from source: magic vars 41445 1727204209.84762: variable 'network_service_name' from source: role '' defaults 41445 1727204209.84831: variable 'network_service_name' from source: role '' defaults 41445 1727204209.84944: variable '__network_provider_setup' from source: role '' defaults 41445 1727204209.84955: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204209.85026: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204209.85040: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204209.85112: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204209.85344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41445 1727204209.87651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204209.87740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204209.87787: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204209.87836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204209.87868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204209.88034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.88038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.88041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.88073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.88092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.88139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204209.88168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.88195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.88236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.88252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.88483: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204209.88598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.88629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.88695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.88698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.88718: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.88812: variable 'ansible_python' from source: facts 41445 1727204209.88839: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204209.88933: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204209.89021: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204209.89389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.89423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.89461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.89526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.89554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.89616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204209.89905: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204209.89908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.89913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204209.89915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204209.89918: variable 'network_connections' from source: play vars 41445 1727204209.89920: variable 'profile' from source: play vars 41445 1727204209.89988: variable 'profile' from source: play vars 41445 1727204209.90000: variable 'interface' from source: set_fact 41445 1727204209.90094: variable 'interface' from source: set_fact 41445 1727204209.90247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204209.90469: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204209.90529: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204209.90585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204209.90634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204209.90706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204209.90744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204209.90788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204209.90829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204209.90883: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.91161: variable 'network_connections' from source: play vars 41445 1727204209.91170: variable 'profile' from source: play vars 41445 1727204209.91256: variable 'profile' from source: play vars 41445 1727204209.91269: variable 'interface' from source: set_fact 41445 1727204209.91360: variable 'interface' from source: set_fact 41445 1727204209.91423: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204209.91791: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204209.92304: variable 'network_connections' from source: play vars 41445 1727204209.92317: variable 'profile' from source: play vars 41445 1727204209.92390: variable 'profile' from source: play vars 41445 1727204209.92682: variable 'interface' from source: set_fact 41445 1727204209.92686: variable 'interface' from source: set_fact 41445 1727204209.92695: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204209.92780: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204209.93466: variable 
'network_connections' from source: play vars 41445 1727204209.93479: variable 'profile' from source: play vars 41445 1727204209.93666: variable 'profile' from source: play vars 41445 1727204209.93678: variable 'interface' from source: set_fact 41445 1727204209.93753: variable 'interface' from source: set_fact 41445 1727204209.93933: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204209.94180: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204209.94183: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204209.94186: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204209.94605: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204209.95752: variable 'network_connections' from source: play vars 41445 1727204209.95763: variable 'profile' from source: play vars 41445 1727204209.95833: variable 'profile' from source: play vars 41445 1727204209.96062: variable 'interface' from source: set_fact 41445 1727204209.96065: variable 'interface' from source: set_fact 41445 1727204209.96068: variable 'ansible_distribution' from source: facts 41445 1727204209.96070: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.96072: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.96186: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204209.96474: variable 'ansible_distribution' from source: facts 41445 1727204209.96516: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.96527: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.96545: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204209.96751: variable 'ansible_distribution' from source: 
facts 41445 1727204209.96761: variable '__network_rh_distros' from source: role '' defaults 41445 1727204209.96772: variable 'ansible_distribution_major_version' from source: facts 41445 1727204209.96820: variable 'network_provider' from source: set_fact 41445 1727204209.96853: variable 'omit' from source: magic vars 41445 1727204209.96888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204209.96924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204209.96953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204209.96978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204209.96995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204209.97032: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204209.97044: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.97053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.97163: Set connection var ansible_shell_executable to /bin/sh 41445 1727204209.97170: Set connection var ansible_shell_type to sh 41445 1727204209.97183: Set connection var ansible_pipelining to False 41445 1727204209.97195: Set connection var ansible_timeout to 10 41445 1727204209.97200: Set connection var ansible_connection to ssh 41445 1727204209.97213: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204209.97245: variable 'ansible_shell_executable' from source: unknown 41445 1727204209.97252: variable 'ansible_connection' from source: unknown 41445 1727204209.97264: variable 'ansible_module_compression' from source: unknown 41445 1727204209.97270: 
variable 'ansible_shell_type' from source: unknown 41445 1727204209.97278: variable 'ansible_shell_executable' from source: unknown 41445 1727204209.97286: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204209.97299: variable 'ansible_pipelining' from source: unknown 41445 1727204209.97307: variable 'ansible_timeout' from source: unknown 41445 1727204209.97319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204209.97433: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204209.97481: variable 'omit' from source: magic vars 41445 1727204209.97485: starting attempt loop 41445 1727204209.97487: running the handler 41445 1727204209.97553: variable 'ansible_facts' from source: unknown 41445 1727204209.98523: _low_level_execute_command(): starting 41445 1727204209.98536: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204209.99298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204209.99341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204209.99369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204209.99403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204209.99451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204210.01213: stdout chunk (state=3): >>>/root <<< 41445 1727204210.01319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204210.01334: stdout chunk (state=3): >>><<< 41445 1727204210.01424: stderr chunk (state=3): >>><<< 41445 1727204210.01586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204210.01590: _low_level_execute_command(): starting 41445 1727204210.01593: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824 `" && echo ansible-tmp-1727204210.0144558-43199-255724004173824="` echo /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824 `" ) && sleep 0' 41445 1727204210.02654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204210.02667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204210.02681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.02757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204210.02768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204210.02871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 41445 1727204210.04757: stdout chunk (state=3): >>>ansible-tmp-1727204210.0144558-43199-255724004173824=/root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824 <<< 41445 1727204210.04973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204210.04978: stdout chunk (state=3): >>><<< 41445 1727204210.04981: stderr chunk (state=3): >>><<< 41445 1727204210.05283: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204210.0144558-43199-255724004173824=/root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204210.05287: variable 'ansible_module_compression' from source: unknown 41445 1727204210.05289: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41445 1727204210.05291: variable 'ansible_facts' from source: unknown 41445 1727204210.05787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py 41445 1727204210.06174: Sending initial data 41445 1727204210.06189: Sent initial data (156 bytes) 41445 1727204210.07597: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.07638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204210.07648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204210.07671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204210.07822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204210.09410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204210.09469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204210.09489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py" <<< 41445 1727204210.09556: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp1nr05n8b /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py <<< 41445 1727204210.09615: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp1nr05n8b" to remote "/root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py" <<< 41445 1727204210.09628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py" <<< 41445 1727204210.12720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204210.12724: stderr chunk (state=3): >>><<< 41445 1727204210.12727: stdout chunk (state=3): >>><<< 41445 1727204210.12729: done transferring module to remote 41445 1727204210.12731: _low_level_execute_command(): starting 41445 1727204210.12733: _low_level_execute_command(): executing: 
/bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/ /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py && sleep 0' 41445 1727204210.13950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204210.13954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.13956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204210.13958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204210.13961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.14142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204210.14223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204210.14310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204210.15993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204210.16019: stderr chunk (state=3): >>><<< 41445 1727204210.16022: stdout chunk (state=3): >>><<< 41445 1727204210.16170: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204210.16174: _low_level_execute_command(): starting 41445 1727204210.16179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/AnsiballZ_systemd.py && sleep 0' 41445 1727204210.17290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.17479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204210.17490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204210.17646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204210.46402: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": 
"28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10510336", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301072896", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1848700000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 41445 1727204210.46407: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", 
"Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-b<<< 41445 1727204210.46417: stdout chunk (state=3): >>>roker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", 
"InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41445 1727204210.48256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
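The JSON above is the full result of the `ansible.legacy.systemd` module run, and its trailing `invocation.module_args` block records exactly what the role passed in. A hedged sketch of a task equivalent to that invocation (the role's real task lives inside the `fedora.linux_system_roles.network` collection; the task name and `no_log` placement here are illustrative):

```yaml
# Sketch only: reconstructed from the "invocation" args recorded above.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
    daemon_reload: false
  # The run below reports the result as censored, consistent with no_log.
  no_log: true
```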
<<< 41445 1727204210.48260: stderr chunk (state=3): >>><<< 41445 1727204210.48263: stdout chunk (state=3): >>><<< 41445 1727204210.48391: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10510336", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3301072896", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1848700000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-broker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
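The stderr dump above shows OpenSSH connection multiplexing at work: `auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6'` means every module invocation reuses one persistent SSH master instead of renegotiating. A hedged sketch of inventory-level settings that produce this behavior (values are illustrative; the control-path hash is generated per connection, and the header of this run shows `ansible_ssh_extra_args` being set from inventory):

```yaml
# Sketch only: group vars enabling SSH ControlMaster reuse for all hosts.
all:
  vars:
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
```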
41445 1727204210.48469: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204210.48491: _low_level_execute_command(): starting 41445 1727204210.48495: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204210.0144558-43199-255724004173824/ > /dev/null 2>&1 && sleep 0' 41445 1727204210.49051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204210.49061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204210.49069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204210.49084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204210.49096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204210.49103: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204210.49115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.49267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204210.49270: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204210.49273: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204210.49275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204210.49280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204210.49281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204210.49283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204210.49285: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204210.49287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204210.49356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204210.49359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204210.49385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204210.49453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204210.51245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204210.51308: stderr chunk (state=3): >>><<< 41445 1727204210.51317: stdout chunk (state=3): >>><<< 41445 1727204210.51335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204210.51346: handler run complete 41445 1727204210.51413: attempt loop complete, returning result 41445 1727204210.51427: _execute() done 41445 1727204210.51432: dumping result to json 41445 1727204210.51451: done dumping result, returning 41445 1727204210.51462: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-bf02-eee4-0000000000c3] 41445 1727204210.51470: sending task result for task 028d2410-947f-bf02-eee4-0000000000c3 41445 1727204210.52018: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c3 41445 1727204210.52021: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204210.52073: no more pending results, returning what we have 41445 1727204210.52079: results queue empty 41445 1727204210.52081: checking for any_errors_fatal 41445 1727204210.52087: done checking for any_errors_fatal 41445 1727204210.52088: checking for max_fail_percentage 41445 1727204210.52090: done checking for max_fail_percentage 41445 1727204210.52091: checking to see if all hosts have failed and 
the running result is not ok 41445 1727204210.52091: done checking to see if all hosts have failed 41445 1727204210.52092: getting the remaining hosts for this loop 41445 1727204210.52094: done getting the remaining hosts for this loop 41445 1727204210.52097: getting the next task for host managed-node3 41445 1727204210.52184: done getting next task for host managed-node3 41445 1727204210.52188: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204210.52190: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204210.52201: getting variables 41445 1727204210.52203: in VariableManager get_vars() 41445 1727204210.52248: Calling all_inventory to load vars for managed-node3 41445 1727204210.52251: Calling groups_inventory to load vars for managed-node3 41445 1727204210.52254: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204210.52264: Calling all_plugins_play to load vars for managed-node3 41445 1727204210.52267: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204210.52271: Calling groups_plugins_play to load vars for managed-node3 41445 1727204210.54151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204210.56012: done with get_vars() 41445 1727204210.56037: done getting variables 41445 1727204210.56112: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and 
start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.730) 0:00:29.349 ***** 41445 1727204210.56151: entering _queue_task() for managed-node3/service 41445 1727204210.56606: worker is 1 (out of 1 available) 41445 1727204210.56619: exiting _queue_task() for managed-node3/service 41445 1727204210.56628: done queuing things up, now waiting for results queue to drain 41445 1727204210.56629: waiting for pending results... 41445 1727204210.56821: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204210.56943: in run() - task 028d2410-947f-bf02-eee4-0000000000c4 41445 1727204210.56970: variable 'ansible_search_path' from source: unknown 41445 1727204210.56980: variable 'ansible_search_path' from source: unknown 41445 1727204210.57048: calling self._execute() 41445 1727204210.57131: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204210.57142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204210.57164: variable 'omit' from source: magic vars 41445 1727204210.57561: variable 'ansible_distribution_major_version' from source: facts 41445 1727204210.57590: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204210.57724: variable 'network_provider' from source: set_fact 41445 1727204210.57727: Evaluated conditional (network_provider == "nm"): True 41445 1727204210.57882: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204210.57943: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204210.58124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204210.60435: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204210.60514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204210.60561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204210.60603: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204210.60665: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204210.60737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204210.60782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204210.60814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204210.60857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204210.60980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204210.60985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204210.60988: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204210.60990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204210.61024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204210.61106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204210.61109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204210.61111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204210.61126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204210.61161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204210.61179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
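The lines that follow evaluate `__network_wpa_supplicant_required`, find it False, and skip the task. A hedged sketch of the shape of that gated task (the role's actual task in `roles/network/tasks/main.yml` may differ; module and parameters here are illustrative):

```yaml
# Sketch only: wpa_supplicant is managed only when wireless or 802.1x
# profiles require it; in this run the condition evaluated False.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required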
41445 1727204210.61341: variable 'network_connections' from source: play vars 41445 1727204210.61361: variable 'profile' from source: play vars 41445 1727204210.61453: variable 'profile' from source: play vars 41445 1727204210.61464: variable 'interface' from source: set_fact 41445 1727204210.61545: variable 'interface' from source: set_fact 41445 1727204210.61628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204210.61818: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204210.61869: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204210.61979: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204210.61982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204210.61992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204210.62020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204210.62050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204210.62090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204210.62146: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204210.62427: variable 'network_connections' 
from source: play vars 41445 1727204210.62438: variable 'profile' from source: play vars 41445 1727204210.62503: variable 'profile' from source: play vars 41445 1727204210.62520: variable 'interface' from source: set_fact 41445 1727204210.62582: variable 'interface' from source: set_fact 41445 1727204210.62617: Evaluated conditional (__network_wpa_supplicant_required): False 41445 1727204210.62681: when evaluation is False, skipping this task 41445 1727204210.62684: _execute() done 41445 1727204210.62696: dumping result to json 41445 1727204210.62698: done dumping result, returning 41445 1727204210.62701: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-bf02-eee4-0000000000c4] 41445 1727204210.62704: sending task result for task 028d2410-947f-bf02-eee4-0000000000c4 41445 1727204210.62896: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c4 41445 1727204210.62899: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41445 1727204210.62951: no more pending results, returning what we have 41445 1727204210.62954: results queue empty 41445 1727204210.62955: checking for any_errors_fatal 41445 1727204210.62980: done checking for any_errors_fatal 41445 1727204210.62982: checking for max_fail_percentage 41445 1727204210.62984: done checking for max_fail_percentage 41445 1727204210.62985: checking to see if all hosts have failed and the running result is not ok 41445 1727204210.62986: done checking to see if all hosts have failed 41445 1727204210.62986: getting the remaining hosts for this loop 41445 1727204210.62988: done getting the remaining hosts for this loop 41445 1727204210.62993: getting the next task for host managed-node3 41445 1727204210.63000: done getting next task for host managed-node3 41445 1727204210.63005: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 41445 1727204210.63007: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204210.63191: getting variables 41445 1727204210.63194: in VariableManager get_vars() 41445 1727204210.63232: Calling all_inventory to load vars for managed-node3 41445 1727204210.63235: Calling groups_inventory to load vars for managed-node3 41445 1727204210.63238: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204210.63247: Calling all_plugins_play to load vars for managed-node3 41445 1727204210.63250: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204210.63253: Calling groups_plugins_play to load vars for managed-node3 41445 1727204210.64751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204210.66886: done with get_vars() 41445 1727204210.66918: done getting variables 41445 1727204210.67174: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.111) 0:00:29.460 ***** 41445 1727204210.67294: entering _queue_task() for managed-node3/service 41445 1727204210.67787: worker is 1 (out of 1 available) 41445 1727204210.67803: exiting _queue_task() for managed-node3/service 41445 
1727204210.67817: done queuing things up, now waiting for results queue to drain 41445 1727204210.67819: waiting for pending results... 41445 1727204210.68136: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204210.68194: in run() - task 028d2410-947f-bf02-eee4-0000000000c5 41445 1727204210.68218: variable 'ansible_search_path' from source: unknown 41445 1727204210.68342: variable 'ansible_search_path' from source: unknown 41445 1727204210.68346: calling self._execute() 41445 1727204210.68388: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204210.68398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204210.68415: variable 'omit' from source: magic vars 41445 1727204210.68814: variable 'ansible_distribution_major_version' from source: facts 41445 1727204210.68833: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204210.68961: variable 'network_provider' from source: set_fact 41445 1727204210.68972: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204210.68981: when evaluation is False, skipping this task 41445 1727204210.68995: _execute() done 41445 1727204210.69007: dumping result to json 41445 1727204210.69018: done dumping result, returning 41445 1727204210.69029: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-bf02-eee4-0000000000c5] 41445 1727204210.69099: sending task result for task 028d2410-947f-bf02-eee4-0000000000c5 41445 1727204210.69169: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c5 41445 1727204210.69172: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204210.69228: no more pending results, returning what we have 41445 
1727204210.69232: results queue empty 41445 1727204210.69233: checking for any_errors_fatal 41445 1727204210.69243: done checking for any_errors_fatal 41445 1727204210.69243: checking for max_fail_percentage 41445 1727204210.69246: done checking for max_fail_percentage 41445 1727204210.69247: checking to see if all hosts have failed and the running result is not ok 41445 1727204210.69248: done checking to see if all hosts have failed 41445 1727204210.69248: getting the remaining hosts for this loop 41445 1727204210.69250: done getting the remaining hosts for this loop 41445 1727204210.69254: getting the next task for host managed-node3 41445 1727204210.69261: done getting next task for host managed-node3 41445 1727204210.69264: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204210.69267: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204210.69284: getting variables 41445 1727204210.69286: in VariableManager get_vars() 41445 1727204210.69395: Calling all_inventory to load vars for managed-node3 41445 1727204210.69399: Calling groups_inventory to load vars for managed-node3 41445 1727204210.69401: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204210.69417: Calling all_plugins_play to load vars for managed-node3 41445 1727204210.69536: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204210.69541: Calling groups_plugins_play to load vars for managed-node3 41445 1727204210.73449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204210.76713: done with get_vars() 41445 1727204210.76747: done getting variables 41445 1727204210.76816: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.095) 0:00:29.556 ***** 41445 1727204210.76848: entering _queue_task() for managed-node3/copy 41445 1727204210.77911: worker is 1 (out of 1 available) 41445 1727204210.77921: exiting _queue_task() for managed-node3/copy 41445 1727204210.77931: done queuing things up, now waiting for results queue to drain 41445 1727204210.77933: waiting for pending results... 
41445 1727204210.78497: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204210.78703: in run() - task 028d2410-947f-bf02-eee4-0000000000c6 41445 1727204210.78707: variable 'ansible_search_path' from source: unknown 41445 1727204210.78712: variable 'ansible_search_path' from source: unknown 41445 1727204210.78715: calling self._execute() 41445 1727204210.78822: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204210.78834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204210.78851: variable 'omit' from source: magic vars 41445 1727204210.79283: variable 'ansible_distribution_major_version' from source: facts 41445 1727204210.79308: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204210.79445: variable 'network_provider' from source: set_fact 41445 1727204210.79456: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204210.79469: when evaluation is False, skipping this task 41445 1727204210.79478: _execute() done 41445 1727204210.79487: dumping result to json 41445 1727204210.79495: done dumping result, returning 41445 1727204210.79578: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-bf02-eee4-0000000000c6] 41445 1727204210.79581: sending task result for task 028d2410-947f-bf02-eee4-0000000000c6 41445 1727204210.79664: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c6 41445 1727204210.79667: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41445 1727204210.79732: no more pending results, returning what we have 41445 1727204210.79737: results queue empty 41445 1727204210.79738: checking for 
any_errors_fatal 41445 1727204210.79747: done checking for any_errors_fatal 41445 1727204210.79748: checking for max_fail_percentage 41445 1727204210.79750: done checking for max_fail_percentage 41445 1727204210.79751: checking to see if all hosts have failed and the running result is not ok 41445 1727204210.79752: done checking to see if all hosts have failed 41445 1727204210.79753: getting the remaining hosts for this loop 41445 1727204210.79754: done getting the remaining hosts for this loop 41445 1727204210.79758: getting the next task for host managed-node3 41445 1727204210.79765: done getting next task for host managed-node3 41445 1727204210.79769: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204210.79771: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204210.79790: getting variables 41445 1727204210.79793: in VariableManager get_vars() 41445 1727204210.79837: Calling all_inventory to load vars for managed-node3 41445 1727204210.79840: Calling groups_inventory to load vars for managed-node3 41445 1727204210.79843: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204210.79857: Calling all_plugins_play to load vars for managed-node3 41445 1727204210.79861: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204210.79864: Calling groups_plugins_play to load vars for managed-node3 41445 1727204210.81673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204210.83385: done with get_vars() 41445 1727204210.83418: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.066) 0:00:29.622 ***** 41445 1727204210.83513: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204210.83870: worker is 1 (out of 1 available) 41445 1727204210.83995: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204210.84006: done queuing things up, now waiting for results queue to drain 41445 1727204210.84007: waiting for pending results... 
41445 1727204210.84293: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204210.84303: in run() - task 028d2410-947f-bf02-eee4-0000000000c7 41445 1727204210.84330: variable 'ansible_search_path' from source: unknown 41445 1727204210.84336: variable 'ansible_search_path' from source: unknown 41445 1727204210.84390: calling self._execute() 41445 1727204210.84490: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204210.84536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204210.84540: variable 'omit' from source: magic vars 41445 1727204210.84970: variable 'ansible_distribution_major_version' from source: facts 41445 1727204210.84976: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204210.84979: variable 'omit' from source: magic vars 41445 1727204210.85000: variable 'omit' from source: magic vars 41445 1727204210.85171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204210.89184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204210.89229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204210.89526: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204210.89635: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204210.89638: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204210.89688: variable 'network_provider' from source: set_fact 41445 1727204210.89945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204210.90178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204210.90181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204210.90184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204210.90390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204210.90393: variable 'omit' from source: magic vars 41445 1727204210.90580: variable 'omit' from source: magic vars 41445 1727204210.90822: variable 'network_connections' from source: play vars 41445 1727204210.90895: variable 'profile' from source: play vars 41445 1727204210.91155: variable 'profile' from source: play vars 41445 1727204210.91158: variable 'interface' from source: set_fact 41445 1727204210.91160: variable 'interface' from source: set_fact 41445 1727204210.91508: variable 'omit' from source: magic vars 41445 1727204210.91526: variable '__lsr_ansible_managed' from source: task vars 41445 1727204210.91781: variable '__lsr_ansible_managed' from source: task vars 41445 1727204210.92196: Loaded config def from plugin (lookup/template) 41445 1727204210.92360: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41445 1727204210.92395: File lookup term: get_ansible_managed.j2 41445 
1727204210.92403: variable 'ansible_search_path' from source: unknown 41445 1727204210.92419: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41445 1727204210.92439: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41445 1727204210.92466: variable 'ansible_search_path' from source: unknown 41445 1727204211.05746: variable 'ansible_managed' from source: unknown 41445 1727204211.06093: variable 'omit' from source: magic vars 41445 1727204211.06167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204211.06273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204211.06298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204211.06326: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.06365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.06401: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204211.06468: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.06479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.06694: Set connection var ansible_shell_executable to /bin/sh 41445 1727204211.06703: Set connection var ansible_shell_type to sh 41445 1727204211.06718: Set connection var ansible_pipelining to False 41445 1727204211.06731: Set connection var ansible_timeout to 10 41445 1727204211.06739: Set connection var ansible_connection to ssh 41445 1727204211.06897: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204211.06901: variable 'ansible_shell_executable' from source: unknown 41445 1727204211.06903: variable 'ansible_connection' from source: unknown 41445 1727204211.06905: variable 'ansible_module_compression' from source: unknown 41445 1727204211.06907: variable 'ansible_shell_type' from source: unknown 41445 1727204211.06912: variable 'ansible_shell_executable' from source: unknown 41445 1727204211.06914: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.06915: variable 'ansible_pipelining' from source: unknown 41445 1727204211.06917: variable 'ansible_timeout' from source: unknown 41445 1727204211.06919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.07282: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204211.07294: variable 'omit' from source: magic vars 41445 1727204211.07297: starting attempt loop 41445 1727204211.07300: running the handler 41445 1727204211.07302: _low_level_execute_command(): starting 41445 1727204211.07304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204211.08636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204211.09144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.09167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204211.09179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204211.09203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204211.09741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204211.11416: stdout chunk (state=3): >>>/root <<< 41445 1727204211.11552: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204211.11558: stdout chunk (state=3): >>><<< 41445 1727204211.11567: stderr chunk (state=3): >>><<< 41445 1727204211.11591: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204211.11604: _low_level_execute_command(): starting 41445 1727204211.11613: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655 `" && echo ansible-tmp-1727204211.1159232-43238-276464294248655="` echo /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655 `" ) && sleep 0' 41445 1727204211.13058: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 41445 1727204211.13062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204211.13065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.13068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204211.13070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204211.13072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.13227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204211.13231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204211.13303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204211.13341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204211.15348: stdout chunk (state=3): >>>ansible-tmp-1727204211.1159232-43238-276464294248655=/root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655 <<< 41445 1727204211.15439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204211.15443: stderr chunk (state=3): >>><<< 41445 1727204211.15446: stdout chunk (state=3): >>><<< 41445 1727204211.15674: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204211.1159232-43238-276464294248655=/root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204211.15681: variable 'ansible_module_compression' from source: unknown 41445 1727204211.15685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41445 1727204211.15887: variable 'ansible_facts' from source: unknown 41445 1727204211.16081: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py 41445 1727204211.16447: Sending initial data 41445 1727204211.16451: Sent initial data (168 bytes) 41445 1727204211.17585: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204211.17589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.17595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204211.17597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204211.17706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204211.17714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.17778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204211.17782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204211.17902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204211.17965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204211.19473: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204211.19519: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204211.19566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp5jgyov3_ /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py <<< 41445 1727204211.19570: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py" <<< 41445 1727204211.19655: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp5jgyov3_" to remote "/root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py" <<< 41445 1727204211.21354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204211.21379: stderr chunk (state=3): >>><<< 41445 1727204211.21383: stdout chunk (state=3): >>><<< 41445 1727204211.21435: done transferring module to remote 41445 1727204211.21446: _low_level_execute_command(): starting 41445 1727204211.21451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/ /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py && sleep 
0' 41445 1727204211.22666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204211.22677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204211.22688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204211.22707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204211.22717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204211.22722: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204211.22781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.22785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204211.22787: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204211.22790: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204211.22792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204211.22794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204211.22797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204211.22799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204211.22801: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204211.22927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204211.22994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204211.23003: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204211.23061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204211.24824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204211.24878: stderr chunk (state=3): >>><<< 41445 1727204211.24881: stdout chunk (state=3): >>><<< 41445 1727204211.25017: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204211.25021: _low_level_execute_command(): starting 41445 1727204211.25023: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/AnsiballZ_network_connections.py && sleep 0' 41445 1727204211.26243: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
41445 1727204211.26500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.26512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204211.26516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204211.26606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204211.26819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204211.60935: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41445 1727204211.62774: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204211.62780: stdout chunk (state=3): >>><<< 41445 1727204211.62787: stderr chunk (state=3): >>><<< 41445 1727204211.62803: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 Shared connection to 10.31.47.22 closed. 41445 1727204211.62835: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204211.62842: _low_level_execute_command(): starting 41445 1727204211.62847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204211.1159232-43238-276464294248655/ > /dev/null 2>&1 && sleep 0' 41445 1727204211.63295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204211.63299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.63310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204211.63372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204211.63380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204211.63382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204211.63408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204211.65234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204211.65238: stdout chunk (state=3): >>><<< 41445 1727204211.65481: stderr chunk (state=3): >>><<< 41445 1727204211.65484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204211.65486: handler run complete 41445 1727204211.65489: attempt loop complete, returning result 41445 1727204211.65490: _execute() done 41445 1727204211.65492: dumping result to json 41445 1727204211.65494: done dumping result, returning 41445 1727204211.65496: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-bf02-eee4-0000000000c7] 41445 1727204211.65497: sending task result for task 028d2410-947f-bf02-eee4-0000000000c7 41445 1727204211.65563: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c7 41445 1727204211.65566: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 41445 1727204211.65684: no more pending results, returning what we have 41445 1727204211.65687: results queue empty 41445 1727204211.65688: checking for any_errors_fatal 41445 1727204211.65696: done checking for any_errors_fatal 41445 1727204211.65697: checking for max_fail_percentage 41445 1727204211.65699: done checking for max_fail_percentage 41445 1727204211.65699: checking to see if all hosts have failed and the running result is not ok 41445 1727204211.65700: done checking to see if all hosts have failed 41445 1727204211.65701: getting the remaining hosts for this loop 41445 1727204211.65702: done getting the remaining hosts for this loop 41445 1727204211.65706: getting the next task for host managed-node3 41445 1727204211.65714: done getting next task for host 
managed-node3 41445 1727204211.65718: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204211.65720: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204211.65729: getting variables 41445 1727204211.65731: in VariableManager get_vars() 41445 1727204211.65767: Calling all_inventory to load vars for managed-node3 41445 1727204211.65770: Calling groups_inventory to load vars for managed-node3 41445 1727204211.65773: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204211.65966: Calling all_plugins_play to load vars for managed-node3 41445 1727204211.65970: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204211.65974: Calling groups_plugins_play to load vars for managed-node3 41445 1727204211.67554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204211.69239: done with get_vars() 41445 1727204211.69271: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.858) 0:00:30.481 ***** 41445 1727204211.69367: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204211.69732: worker is 1 (out of 1 available) 41445 1727204211.69745: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204211.69756: done queuing things up, now waiting for results queue to drain 41445 1727204211.69758: waiting for pending results... 
41445 1727204211.70048: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204211.70158: in run() - task 028d2410-947f-bf02-eee4-0000000000c8 41445 1727204211.70182: variable 'ansible_search_path' from source: unknown 41445 1727204211.70190: variable 'ansible_search_path' from source: unknown 41445 1727204211.70240: calling self._execute() 41445 1727204211.70360: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.70371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.70389: variable 'omit' from source: magic vars 41445 1727204211.70807: variable 'ansible_distribution_major_version' from source: facts 41445 1727204211.70830: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204211.70974: variable 'network_state' from source: role '' defaults 41445 1727204211.71091: Evaluated conditional (network_state != {}): False 41445 1727204211.71095: when evaluation is False, skipping this task 41445 1727204211.71097: _execute() done 41445 1727204211.71100: dumping result to json 41445 1727204211.71104: done dumping result, returning 41445 1727204211.71107: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-bf02-eee4-0000000000c8] 41445 1727204211.71109: sending task result for task 028d2410-947f-bf02-eee4-0000000000c8 41445 1727204211.71178: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c8 41445 1727204211.71182: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204211.71240: no more pending results, returning what we have 41445 1727204211.71245: results queue empty 41445 1727204211.71246: checking for any_errors_fatal 41445 1727204211.71259: done checking for any_errors_fatal 
41445 1727204211.71260: checking for max_fail_percentage 41445 1727204211.71262: done checking for max_fail_percentage 41445 1727204211.71262: checking to see if all hosts have failed and the running result is not ok 41445 1727204211.71263: done checking to see if all hosts have failed 41445 1727204211.71264: getting the remaining hosts for this loop 41445 1727204211.71266: done getting the remaining hosts for this loop 41445 1727204211.71270: getting the next task for host managed-node3 41445 1727204211.71279: done getting next task for host managed-node3 41445 1727204211.71282: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204211.71285: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204211.71302: getting variables 41445 1727204211.71304: in VariableManager get_vars() 41445 1727204211.71346: Calling all_inventory to load vars for managed-node3 41445 1727204211.71349: Calling groups_inventory to load vars for managed-node3 41445 1727204211.71352: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204211.71366: Calling all_plugins_play to load vars for managed-node3 41445 1727204211.71370: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204211.71373: Calling groups_plugins_play to load vars for managed-node3 41445 1727204211.73077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204211.74890: done with get_vars() 41445 1727204211.74921: done getting variables 41445 1727204211.74988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.056) 0:00:30.537 ***** 41445 1727204211.75021: entering _queue_task() for managed-node3/debug 41445 1727204211.75381: worker is 1 (out of 1 available) 41445 1727204211.75486: exiting _queue_task() for managed-node3/debug 41445 1727204211.75500: done queuing things up, now waiting for results queue to drain 41445 1727204211.75501: waiting for pending results... 
41445 1727204211.75796: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204211.75824: in run() - task 028d2410-947f-bf02-eee4-0000000000c9 41445 1727204211.75849: variable 'ansible_search_path' from source: unknown 41445 1727204211.75856: variable 'ansible_search_path' from source: unknown 41445 1727204211.75907: calling self._execute() 41445 1727204211.76045: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.76048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.76050: variable 'omit' from source: magic vars 41445 1727204211.76470: variable 'ansible_distribution_major_version' from source: facts 41445 1727204211.76497: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204211.76546: variable 'omit' from source: magic vars 41445 1727204211.76564: variable 'omit' from source: magic vars 41445 1727204211.76617: variable 'omit' from source: magic vars 41445 1727204211.76669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204211.76722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204211.76748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204211.76782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.76918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.76921: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204211.76924: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.76926: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204211.77017: Set connection var ansible_shell_executable to /bin/sh 41445 1727204211.77030: Set connection var ansible_shell_type to sh 41445 1727204211.77044: Set connection var ansible_pipelining to False 41445 1727204211.77073: Set connection var ansible_timeout to 10 41445 1727204211.77085: Set connection var ansible_connection to ssh 41445 1727204211.77099: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204211.77157: variable 'ansible_shell_executable' from source: unknown 41445 1727204211.77164: variable 'ansible_connection' from source: unknown 41445 1727204211.77170: variable 'ansible_module_compression' from source: unknown 41445 1727204211.77178: variable 'ansible_shell_type' from source: unknown 41445 1727204211.77242: variable 'ansible_shell_executable' from source: unknown 41445 1727204211.77246: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.77248: variable 'ansible_pipelining' from source: unknown 41445 1727204211.77250: variable 'ansible_timeout' from source: unknown 41445 1727204211.77253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.77375: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204211.77395: variable 'omit' from source: magic vars 41445 1727204211.77461: starting attempt loop 41445 1727204211.77464: running the handler 41445 1727204211.77591: variable '__network_connections_result' from source: set_fact 41445 1727204211.77652: handler run complete 41445 1727204211.77682: attempt loop complete, returning result 41445 1727204211.77694: _execute() done 41445 1727204211.77701: dumping result to json 41445 1727204211.77713: 
done dumping result, returning 41445 1727204211.77727: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-bf02-eee4-0000000000c9] 41445 1727204211.77737: sending task result for task 028d2410-947f-bf02-eee4-0000000000c9 41445 1727204211.77859: done sending task result for task 028d2410-947f-bf02-eee4-0000000000c9 41445 1727204211.77862: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 41445 1727204211.78031: no more pending results, returning what we have 41445 1727204211.78035: results queue empty 41445 1727204211.78036: checking for any_errors_fatal 41445 1727204211.78044: done checking for any_errors_fatal 41445 1727204211.78045: checking for max_fail_percentage 41445 1727204211.78047: done checking for max_fail_percentage 41445 1727204211.78048: checking to see if all hosts have failed and the running result is not ok 41445 1727204211.78049: done checking to see if all hosts have failed 41445 1727204211.78049: getting the remaining hosts for this loop 41445 1727204211.78052: done getting the remaining hosts for this loop 41445 1727204211.78056: getting the next task for host managed-node3 41445 1727204211.78063: done getting next task for host managed-node3 41445 1727204211.78068: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204211.78071: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204211.78083: getting variables 41445 1727204211.78085: in VariableManager get_vars() 41445 1727204211.78240: Calling all_inventory to load vars for managed-node3 41445 1727204211.78244: Calling groups_inventory to load vars for managed-node3 41445 1727204211.78246: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204211.78257: Calling all_plugins_play to load vars for managed-node3 41445 1727204211.78260: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204211.78263: Calling groups_plugins_play to load vars for managed-node3 41445 1727204211.80085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204211.81835: done with get_vars() 41445 1727204211.81862: done getting variables 41445 1727204211.82044: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.070) 0:00:30.608 ***** 41445 1727204211.82071: entering _queue_task() for managed-node3/debug 41445 1727204211.83030: worker is 1 (out of 1 available) 41445 1727204211.83043: exiting _queue_task() for managed-node3/debug 41445 1727204211.83054: done queuing things up, now waiting for results queue to drain 41445 1727204211.83055: waiting for pending results... 
41445 1727204211.83896: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204211.83902: in run() - task 028d2410-947f-bf02-eee4-0000000000ca 41445 1727204211.83906: variable 'ansible_search_path' from source: unknown 41445 1727204211.83909: variable 'ansible_search_path' from source: unknown 41445 1727204211.83934: calling self._execute() 41445 1727204211.84156: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.84166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.84185: variable 'omit' from source: magic vars 41445 1727204211.84969: variable 'ansible_distribution_major_version' from source: facts 41445 1727204211.85182: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204211.85186: variable 'omit' from source: magic vars 41445 1727204211.85188: variable 'omit' from source: magic vars 41445 1727204211.85190: variable 'omit' from source: magic vars 41445 1727204211.85332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204211.85371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204211.85508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204211.85516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.85533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.85564: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204211.85573: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.85623: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204211.85881: Set connection var ansible_shell_executable to /bin/sh 41445 1727204211.85884: Set connection var ansible_shell_type to sh 41445 1727204211.85886: Set connection var ansible_pipelining to False 41445 1727204211.85888: Set connection var ansible_timeout to 10 41445 1727204211.85890: Set connection var ansible_connection to ssh 41445 1727204211.85893: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204211.85958: variable 'ansible_shell_executable' from source: unknown 41445 1727204211.85967: variable 'ansible_connection' from source: unknown 41445 1727204211.85974: variable 'ansible_module_compression' from source: unknown 41445 1727204211.85984: variable 'ansible_shell_type' from source: unknown 41445 1727204211.85990: variable 'ansible_shell_executable' from source: unknown 41445 1727204211.85997: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.86004: variable 'ansible_pipelining' from source: unknown 41445 1727204211.86014: variable 'ansible_timeout' from source: unknown 41445 1727204211.86270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.86323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204211.86392: variable 'omit' from source: magic vars 41445 1727204211.86402: starting attempt loop 41445 1727204211.86409: running the handler 41445 1727204211.86465: variable '__network_connections_result' from source: set_fact 41445 1727204211.86671: variable '__network_connections_result' from source: set_fact 41445 1727204211.86881: handler run complete 41445 1727204211.86940: attempt loop complete, returning result 41445 1727204211.87030: 
_execute() done 41445 1727204211.87038: dumping result to json 41445 1727204211.87047: done dumping result, returning 41445 1727204211.87059: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-bf02-eee4-0000000000ca] 41445 1727204211.87069: sending task result for task 028d2410-947f-bf02-eee4-0000000000ca ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 41445 1727204211.87258: no more pending results, returning what we have 41445 1727204211.87263: results queue empty 41445 1727204211.87264: checking for any_errors_fatal 41445 1727204211.87271: done checking for any_errors_fatal 41445 1727204211.87271: checking for max_fail_percentage 41445 1727204211.87273: done checking for max_fail_percentage 41445 1727204211.87274: checking to see if all hosts have failed and the running result is not ok 41445 1727204211.87277: done checking to see if all hosts have failed 41445 1727204211.87277: getting the remaining hosts for this loop 41445 1727204211.87279: done getting the remaining hosts for this loop 41445 1727204211.87283: getting the next task for host managed-node3 41445 1727204211.87289: done getting next task for host managed-node3 41445 1727204211.87293: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204211.87295: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41445 1727204211.87305: getting variables 41445 1727204211.87307: in VariableManager get_vars() 41445 1727204211.87349: Calling all_inventory to load vars for managed-node3 41445 1727204211.87352: Calling groups_inventory to load vars for managed-node3 41445 1727204211.87354: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204211.87366: Calling all_plugins_play to load vars for managed-node3 41445 1727204211.87369: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204211.87372: Calling groups_plugins_play to load vars for managed-node3 41445 1727204211.88390: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ca 41445 1727204211.88393: WORKER PROCESS EXITING 41445 1727204211.90447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204211.92409: done with get_vars() 41445 1727204211.92433: done getting variables 41445 1727204211.92494: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.104) 0:00:30.713 *****
41445 1727204211.92541: entering _queue_task() for managed-node3/debug 41445 1727204211.92927: worker is 1 (out of 1 available) 41445 1727204211.92941: exiting _queue_task() for managed-node3/debug 41445 1727204211.92953: done queuing things up, now waiting for results queue to drain 41445 1727204211.92955: waiting for pending results... 
41445 1727204211.93250: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204211.93372: in run() - task 028d2410-947f-bf02-eee4-0000000000cb 41445 1727204211.93398: variable 'ansible_search_path' from source: unknown 41445 1727204211.93415: variable 'ansible_search_path' from source: unknown 41445 1727204211.93455: calling self._execute() 41445 1727204211.93628: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.93632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.93635: variable 'omit' from source: magic vars 41445 1727204211.94033: variable 'ansible_distribution_major_version' from source: facts 41445 1727204211.94052: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204211.94200: variable 'network_state' from source: role '' defaults 41445 1727204211.94220: Evaluated conditional (network_state != {}): False 41445 1727204211.94227: when evaluation is False, skipping this task 41445 1727204211.94234: _execute() done 41445 1727204211.94242: dumping result to json 41445 1727204211.94284: done dumping result, returning 41445 1727204211.94288: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-bf02-eee4-0000000000cb] 41445 1727204211.94290: sending task result for task 028d2410-947f-bf02-eee4-0000000000cb
skipping: [managed-node3] => {
    "false_condition": "network_state != {}"
}
41445 1727204211.94438: no more pending results, returning what we have 41445 1727204211.94442: results queue empty 41445 1727204211.94444: checking for any_errors_fatal 41445 1727204211.94456: done checking for any_errors_fatal 41445 1727204211.94457: checking for max_fail_percentage 41445 1727204211.94459: done checking for max_fail_percentage 41445 1727204211.94460: checking to see if all hosts have
failed and the running result is not ok 41445 1727204211.94461: done checking to see if all hosts have failed 41445 1727204211.94462: getting the remaining hosts for this loop 41445 1727204211.94464: done getting the remaining hosts for this loop 41445 1727204211.94467: getting the next task for host managed-node3 41445 1727204211.94475: done getting next task for host managed-node3 41445 1727204211.94481: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204211.94484: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204211.94500: getting variables 41445 1727204211.94502: in VariableManager get_vars() 41445 1727204211.94543: Calling all_inventory to load vars for managed-node3 41445 1727204211.94546: Calling groups_inventory to load vars for managed-node3 41445 1727204211.94549: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204211.94562: Calling all_plugins_play to load vars for managed-node3 41445 1727204211.94566: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204211.94569: Calling groups_plugins_play to load vars for managed-node3 41445 1727204211.95414: done sending task result for task 028d2410-947f-bf02-eee4-0000000000cb 41445 1727204211.95418: WORKER PROCESS EXITING 41445 1727204211.96298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204211.97942: done with get_vars() 41445 1727204211.97970: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.055) 0:00:30.768 *****
41445 1727204211.98070: entering _queue_task() for managed-node3/ping 41445 1727204211.98525: worker is 1 (out of 1 available) 41445 1727204211.98537: exiting _queue_task() for managed-node3/ping 41445 1727204211.98548: done queuing things up, now waiting for results queue to drain 41445 1727204211.98549: waiting for pending results... 41445 1727204211.98837: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204211.98971: in run() - task 028d2410-947f-bf02-eee4-0000000000cc 41445 1727204211.98995: variable 'ansible_search_path' from source: unknown 41445 1727204211.99003: variable 'ansible_search_path' from source: unknown 41445 1727204211.99052: calling self._execute() 41445 1727204211.99162: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.99172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204211.99188: variable 'omit' from source: magic vars 41445 1727204211.99616: variable 'ansible_distribution_major_version' from source: facts 41445 1727204211.99634: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204211.99644: variable 'omit' from source: magic vars 41445 1727204211.99686: variable 'omit' from source: magic vars 41445 1727204211.99733: variable 'omit' from source: magic vars 41445 1727204211.99791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204211.99838: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204211.99899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204211.99907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.99913: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204211.99943: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204211.99950: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204211.99956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204212.00059: Set connection var ansible_shell_executable to /bin/sh 41445 1727204212.00066: Set connection var ansible_shell_type to sh 41445 1727204212.00118: Set connection var ansible_pipelining to False 41445 1727204212.00123: Set connection var ansible_timeout to 10 41445 1727204212.00125: Set connection var ansible_connection to ssh 41445 1727204212.00127: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204212.00188: variable 'ansible_shell_executable' from source: unknown 41445 1727204212.00191: variable 'ansible_connection' from source: unknown 41445 1727204212.00194: variable 'ansible_module_compression' from source: unknown 41445 1727204212.00196: variable 'ansible_shell_type' from source: unknown 41445 1727204212.00198: variable 'ansible_shell_executable' from source: unknown 41445 1727204212.00200: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204212.00202: variable 'ansible_pipelining' from source: unknown 41445 1727204212.00204: variable 'ansible_timeout' from source: unknown 41445 1727204212.00206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204212.00459: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204212.00468: variable 'omit' from source: magic vars 41445 1727204212.00470: starting attempt loop 41445 1727204212.00472: running 
the handler 41445 1727204212.00491: _low_level_execute_command(): starting 41445 1727204212.00533: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204212.01316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.01371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.01437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204212.01446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.01474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.01607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.03215: stdout chunk (state=3): >>>/root <<< 41445 1727204212.03543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.03546: stdout chunk (state=3): >>><<< 41445 1727204212.03548: stderr chunk (state=3): >>><<< 41445 1727204212.03551: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.03553: _low_level_execute_command(): starting 41445 1727204212.03556: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816 `" && echo ansible-tmp-1727204212.0349777-43280-163326513083816="` echo /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816 `" ) && sleep 0' 41445 1727204212.04635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204212.04648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204212.04688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.04715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204212.04728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204212.04750: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.04827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.04847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.04962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.06834: stdout chunk (state=3): >>>ansible-tmp-1727204212.0349777-43280-163326513083816=/root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816 <<< 41445 1727204212.06978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.07282: stdout chunk (state=3): >>><<< 41445 1727204212.07286: stderr chunk (state=3): >>><<< 41445 1727204212.07288: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204212.0349777-43280-163326513083816=/root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.07291: variable 'ansible_module_compression' from source: unknown 41445 1727204212.07293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41445 1727204212.07371: variable 'ansible_facts' from source: unknown 41445 1727204212.07468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py 41445 1727204212.07645: Sending initial data 41445 1727204212.07655: Sent initial data (153 bytes) 41445 1727204212.08362: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204212.08381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204212.08406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.08520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.08555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.08628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.10140: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204212.10256: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204212.10292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpg6el565p /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py <<< 41445 1727204212.10388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpg6el565p" to remote "/root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py" <<< 41445 1727204212.11581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.11585: stdout chunk (state=3): >>><<< 41445 1727204212.11587: stderr chunk (state=3): >>><<< 41445 1727204212.11589: done transferring module to remote 41445 1727204212.11591: _low_level_execute_command(): starting 41445 1727204212.11593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/ /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py && sleep 0' 41445 1727204212.12892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204212.12916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204212.13000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.13100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.13130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.13174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.15083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.15087: stderr chunk (state=3): >>><<< 41445 1727204212.15089: stdout chunk (state=3): >>><<< 41445 1727204212.15195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.15199: _low_level_execute_command(): starting 41445 1727204212.15203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/AnsiballZ_ping.py && sleep 0' 41445 1727204212.15939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204212.15953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204212.15965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.16067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.16095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.16168: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 41445 1727204212.30749: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41445 1727204212.31951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204212.31980: stderr chunk (state=3): >>><<< 41445 1727204212.31988: stdout chunk (state=3): >>><<< 41445 1727204212.32003: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204212.32025: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204212.32033: _low_level_execute_command(): starting 41445 1727204212.32037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204212.0349777-43280-163326513083816/ > /dev/null 2>&1 && sleep 0' 41445 1727204212.32449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204212.32486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204212.32489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204212.32491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.32493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.32495: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204212.32497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.32551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204212.32559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.32563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.32598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.34417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.34420: stderr chunk (state=3): >>><<< 41445 1727204212.34423: stdout chunk (state=3): >>><<< 41445 1727204212.34681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.34688: handler run complete 41445 1727204212.34690: attempt loop complete, returning result 41445 1727204212.34692: _execute() done 41445 1727204212.34694: dumping result to json 41445 1727204212.34696: done dumping result, returning 41445 1727204212.34698: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-bf02-eee4-0000000000cc] 41445 1727204212.34699: sending task result for task 028d2410-947f-bf02-eee4-0000000000cc 41445 1727204212.34765: done sending task result for task 028d2410-947f-bf02-eee4-0000000000cc 41445 1727204212.34768: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "ping": "pong"
}
41445 1727204212.34830: no more pending results, returning what we have 41445 1727204212.34833: results queue empty 41445 1727204212.34834: checking for any_errors_fatal 41445 1727204212.34839: done checking for any_errors_fatal 41445 1727204212.34839: checking for max_fail_percentage 41445 1727204212.34841: done checking for max_fail_percentage 41445 1727204212.34842: checking to see if all hosts have failed and the running result is not ok 41445 1727204212.34843: done checking to see if all hosts have failed 41445 1727204212.34843: getting the remaining hosts for this loop 41445 1727204212.34845: done getting the remaining hosts for this loop 41445 1727204212.34848: getting the next task for host managed-node3 41445 1727204212.34855: done getting next task for host managed-node3 41445 1727204212.34858: ^ task is: TASK: meta (role_complete) 41445 1727204212.34860: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41445 1727204212.34869: getting variables 41445 1727204212.34871: in VariableManager get_vars() 41445 1727204212.34912: Calling all_inventory to load vars for managed-node3 41445 1727204212.34915: Calling groups_inventory to load vars for managed-node3 41445 1727204212.34917: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204212.34928: Calling all_plugins_play to load vars for managed-node3 41445 1727204212.34931: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204212.34934: Calling groups_plugins_play to load vars for managed-node3 41445 1727204212.41581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204212.42654: done with get_vars() 41445 1727204212.42673: done getting variables 41445 1727204212.42729: done queuing things up, now waiting for results queue to drain 41445 1727204212.42730: results queue empty 41445 1727204212.42731: checking for any_errors_fatal 41445 1727204212.42733: done checking for any_errors_fatal 41445 1727204212.42733: checking for max_fail_percentage 41445 1727204212.42734: done checking for max_fail_percentage 41445 1727204212.42734: checking to see if all hosts have failed and the running result is not ok 41445 1727204212.42735: done checking to see if all hosts have failed 41445 1727204212.42735: getting the remaining hosts for this loop 41445 1727204212.42736: done getting the remaining hosts for this loop 41445 1727204212.42738: getting the next task for host managed-node3 41445 1727204212.42740: done getting next task for host managed-node3 41445 1727204212.42741: ^ task is: TASK: meta (flush_handlers) 41445 1727204212.42742: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204212.42744: getting variables 41445 1727204212.42745: in VariableManager get_vars() 41445 1727204212.42753: Calling all_inventory to load vars for managed-node3 41445 1727204212.42755: Calling groups_inventory to load vars for managed-node3 41445 1727204212.42756: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204212.42759: Calling all_plugins_play to load vars for managed-node3 41445 1727204212.42760: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204212.42762: Calling groups_plugins_play to load vars for managed-node3 41445 1727204212.43411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204212.44679: done with get_vars() 41445 1727204212.44700: done getting variables 41445 1727204212.44751: in VariableManager get_vars() 41445 1727204212.44762: Calling all_inventory to load vars for managed-node3 41445 1727204212.44765: Calling groups_inventory to load vars for managed-node3 41445 1727204212.44767: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204212.44771: Calling all_plugins_play to load vars for managed-node3 41445 1727204212.44774: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204212.44780: Calling groups_plugins_play to load vars for managed-node3 41445 1727204212.45581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204212.46452: done with get_vars() 41445 1727204212.46472: done queuing things up, now waiting for results queue to drain 41445 1727204212.46474: results queue empty 41445 1727204212.46474: checking for any_errors_fatal 41445 1727204212.46477: done checking for any_errors_fatal 41445 1727204212.46477: checking for max_fail_percentage 41445 1727204212.46478: done checking for 
max_fail_percentage 41445 1727204212.46478: checking to see if all hosts have failed and the running result is not ok 41445 1727204212.46479: done checking to see if all hosts have failed 41445 1727204212.46480: getting the remaining hosts for this loop 41445 1727204212.46480: done getting the remaining hosts for this loop 41445 1727204212.46482: getting the next task for host managed-node3 41445 1727204212.46485: done getting next task for host managed-node3 41445 1727204212.46485: ^ task is: TASK: meta (flush_handlers) 41445 1727204212.46486: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204212.46488: getting variables 41445 1727204212.46489: in VariableManager get_vars() 41445 1727204212.46495: Calling all_inventory to load vars for managed-node3 41445 1727204212.46497: Calling groups_inventory to load vars for managed-node3 41445 1727204212.46498: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204212.46501: Calling all_plugins_play to load vars for managed-node3 41445 1727204212.46503: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204212.46504: Calling groups_plugins_play to load vars for managed-node3 41445 1727204212.47458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204212.49025: done with get_vars() 41445 1727204212.49046: done getting variables 41445 1727204212.49098: in VariableManager get_vars() 41445 1727204212.49113: Calling all_inventory to load vars for managed-node3 41445 1727204212.49115: Calling groups_inventory to load vars for managed-node3 41445 1727204212.49117: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204212.49122: Calling 
all_plugins_play to load vars for managed-node3 41445 1727204212.49124: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204212.49127: Calling groups_plugins_play to load vars for managed-node3 41445 1727204212.50324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204212.51880: done with get_vars() 41445 1727204212.51913: done queuing things up, now waiting for results queue to drain 41445 1727204212.51916: results queue empty 41445 1727204212.51917: checking for any_errors_fatal 41445 1727204212.51918: done checking for any_errors_fatal 41445 1727204212.51919: checking for max_fail_percentage 41445 1727204212.51920: done checking for max_fail_percentage 41445 1727204212.51920: checking to see if all hosts have failed and the running result is not ok 41445 1727204212.51921: done checking to see if all hosts have failed 41445 1727204212.51922: getting the remaining hosts for this loop 41445 1727204212.51923: done getting the remaining hosts for this loop 41445 1727204212.51926: getting the next task for host managed-node3 41445 1727204212.51929: done getting next task for host managed-node3 41445 1727204212.51930: ^ task is: None 41445 1727204212.51931: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204212.51932: done queuing things up, now waiting for results queue to drain 41445 1727204212.51933: results queue empty 41445 1727204212.51934: checking for any_errors_fatal 41445 1727204212.51935: done checking for any_errors_fatal 41445 1727204212.51935: checking for max_fail_percentage 41445 1727204212.51936: done checking for max_fail_percentage 41445 1727204212.51937: checking to see if all hosts have failed and the running result is not ok 41445 1727204212.51938: done checking to see if all hosts have failed 41445 1727204212.51939: getting the next task for host managed-node3 41445 1727204212.51941: done getting next task for host managed-node3 41445 1727204212.51942: ^ task is: None 41445 1727204212.51943: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204212.51983: in VariableManager get_vars() 41445 1727204212.51998: done with get_vars() 41445 1727204212.52004: in VariableManager get_vars() 41445 1727204212.52018: done with get_vars() 41445 1727204212.52023: variable 'omit' from source: magic vars 41445 1727204212.52052: in VariableManager get_vars() 41445 1727204212.52063: done with get_vars() 41445 1727204212.52084: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 41445 1727204212.52254: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41445 1727204212.52284: getting the remaining hosts for this loop 41445 1727204212.52286: done getting the remaining hosts for this loop 41445 1727204212.52288: getting the next task for host managed-node3 41445 1727204212.52291: done getting next task for host managed-node3 41445 1727204212.52293: ^ task is: TASK: Gathering Facts 41445 1727204212.52294: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204212.52296: getting variables 41445 1727204212.52297: in VariableManager get_vars() 41445 1727204212.52306: Calling all_inventory to load vars for managed-node3 41445 1727204212.52308: Calling groups_inventory to load vars for managed-node3 41445 1727204212.52313: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204212.52318: Calling all_plugins_play to load vars for managed-node3 41445 1727204212.52321: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204212.52324: Calling groups_plugins_play to load vars for managed-node3 41445 1727204212.53556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204212.55107: done with get_vars() 41445 1727204212.55129: done getting variables 41445 1727204212.55171: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:56:52 -0400 (0:00:00.571) 0:00:31.339 ***** 41445 1727204212.55195: entering _queue_task() for managed-node3/gather_facts 41445 1727204212.55541: worker is 1 (out of 1 available) 41445 1727204212.55553: exiting _queue_task() for managed-node3/gather_facts 41445 1727204212.55563: done queuing things up, now waiting for results queue to drain 41445 1727204212.55565: waiting for pending results... 
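The `_low_level_execute_command()` calls recorded throughout this log follow a fixed remote-bootstrap sequence: probe the remote home directory with `/bin/sh -c 'echo ~ && sleep 0'`, create a private per-task temp directory under `~/.ansible/tmp` with `umask 77`, transfer the `AnsiballZ_setup.py` payload over the multiplexed SSH session, `chmod u+x` it, then execute it with the remote Python. A minimal local sketch of that sequence (directory names here are illustrative, not the real per-run temp dirs from the log):

```shell
# 1. Probe the home directory, as in:
#    _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
home_dir=$(sh -c 'echo ~ && sleep 0')
echo "home: $home_dir"

# 2. Create a private temp dir; umask 77 yields mode 0700, mirroring the
#    ( umask 77 && mkdir -p ... && mkdir ansible-tmp-... ) step in the log.
tmp_root="${TMPDIR:-/tmp}/ansible-demo-$$"
( umask 77 && mkdir -p "$tmp_root" )

# 3. Stand in for the transferred module and mark it executable, as in the
#    'chmod u+x .../AnsiballZ_setup.py && sleep 0' step (file is a stub here).
touch "$tmp_root/AnsiballZ_setup.py"
chmod u+x "$tmp_root/AnsiballZ_setup.py"

ls -ld "$tmp_root"
```

In the real run each of these steps is a separate `/bin/sh -c '... && sleep 0'` invocation over the existing SSH ControlMaster socket (`/root/.ansible/cp/2833a247f6`), which is why every step's stderr shows `mux_client_request_session` rather than a fresh key exchange.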
41445 1727204212.55839: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41445 1727204212.55909: in run() - task 028d2410-947f-bf02-eee4-00000000076f 41445 1727204212.55940: variable 'ansible_search_path' from source: unknown 41445 1727204212.55993: calling self._execute() 41445 1727204212.56281: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204212.56285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204212.56287: variable 'omit' from source: magic vars 41445 1727204212.56546: variable 'ansible_distribution_major_version' from source: facts 41445 1727204212.56585: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204212.56596: variable 'omit' from source: magic vars 41445 1727204212.56644: variable 'omit' from source: magic vars 41445 1727204212.56685: variable 'omit' from source: magic vars 41445 1727204212.56761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204212.56827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204212.56869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204212.56895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204212.56945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204212.56993: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204212.57002: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204212.57011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204212.57168: Set connection var ansible_shell_executable to /bin/sh 41445 1727204212.57172: Set 
connection var ansible_shell_type to sh 41445 1727204212.57174: Set connection var ansible_pipelining to False 41445 1727204212.57180: Set connection var ansible_timeout to 10 41445 1727204212.57182: Set connection var ansible_connection to ssh 41445 1727204212.57185: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204212.57203: variable 'ansible_shell_executable' from source: unknown 41445 1727204212.57212: variable 'ansible_connection' from source: unknown 41445 1727204212.57220: variable 'ansible_module_compression' from source: unknown 41445 1727204212.57227: variable 'ansible_shell_type' from source: unknown 41445 1727204212.57234: variable 'ansible_shell_executable' from source: unknown 41445 1727204212.57280: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204212.57283: variable 'ansible_pipelining' from source: unknown 41445 1727204212.57285: variable 'ansible_timeout' from source: unknown 41445 1727204212.57287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204212.57451: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204212.57466: variable 'omit' from source: magic vars 41445 1727204212.57475: starting attempt loop 41445 1727204212.57494: running the handler 41445 1727204212.57603: variable 'ansible_facts' from source: unknown 41445 1727204212.57607: _low_level_execute_command(): starting 41445 1727204212.57609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204212.58186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 
1727204212.58209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.58262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204212.58284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.58312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.59943: stdout chunk (state=3): >>>/root <<< 41445 1727204212.60187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.60191: stdout chunk (state=3): >>><<< 41445 1727204212.60194: stderr chunk (state=3): >>><<< 41445 1727204212.60198: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.60201: _low_level_execute_command(): starting 41445 1727204212.60205: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544 `" && echo ansible-tmp-1727204212.6010334-43314-239380082206544="` echo /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544 `" ) && sleep 0' 41445 1727204212.60725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.60729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.60732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204212.60741: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.60800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204212.60803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.60838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.62673: stdout chunk (state=3): >>>ansible-tmp-1727204212.6010334-43314-239380082206544=/root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544 <<< 41445 1727204212.62844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.62847: stdout chunk (state=3): >>><<< 41445 1727204212.62849: stderr chunk (state=3): >>><<< 41445 1727204212.62864: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204212.6010334-43314-239380082206544=/root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.63084: variable 'ansible_module_compression' from source: unknown 41445 1727204212.63087: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 1727204212.63089: variable 'ansible_facts' from source: unknown 41445 1727204212.63259: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py 41445 1727204212.63432: Sending initial data 41445 1727204212.63442: Sent initial data (154 bytes) 41445 1727204212.64021: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.64048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204212.64060: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.64100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204212.64113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.64156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.65728: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204212.65765: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204212.65815: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpk__6w_0h /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py <<< 41445 1727204212.65818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py" <<< 41445 1727204212.65847: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpk__6w_0h" to remote "/root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py" <<< 41445 1727204212.67135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.67281: stderr chunk (state=3): >>><<< 41445 1727204212.67285: stdout chunk (state=3): >>><<< 41445 1727204212.67287: done transferring module to remote 41445 1727204212.67291: _low_level_execute_command(): starting 41445 1727204212.67294: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/ /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py && sleep 0' 41445 1727204212.67725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204212.67749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.67792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204212.67804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.67848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204212.69625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204212.69628: stdout chunk (state=3): >>><<< 41445 1727204212.69631: stderr chunk (state=3): >>><<< 41445 1727204212.69646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204212.69654: _low_level_execute_command(): starting 41445 1727204212.69734: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/AnsiballZ_setup.py && sleep 0' 41445 1727204212.70189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204212.70206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.70221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204212.70267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204212.70286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204212.70321: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 41445 1727204213.35274: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "52", "epoch": "1727204212", "epoch_int": "1727204212", "date": "2024-09-24", "time": "14:56:52", "iso8601_micro": "2024-09-24T18:56:52.965032Z", "iso8601": "2024-09-24T18:56:52Z", "iso8601_basic": "20240924T145652965032", "iso8601_basic_short": "20240924T145652", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Othe<<< 41445 1727204213.35333: stdout chunk (state=3): >>>r", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 790, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788672000, "block_size": 4096, "block_total": 65519099, "block_available": 63913250, "block_used": 1605849, "inode_total": 131070960, "inode_available": 131027340, "inode_used": 43620, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.66943359375, "5m": 0.54541015625, "15m": 0.31982421875}, "ansible_fips": false, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["ethtest0", "lo", "peerethtest0", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "5a:c9:79:b9:fb:44", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::58c9:79ff:feb9:fb44", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "22:cb:5d:bd:5d:c6", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::20cb:5dff:febd:5dc6", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", 
"broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off 
[fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": 
"off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d", "fe80::58c9:79ff:feb9:fb44", "fe80::20cb:5dff:febd:5dc6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d", "fe80::20cb:5dff:febd:5dc6"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204213.37323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204213.37328: stdout chunk (state=3): >>><<< 41445 1727204213.37331: stderr chunk (state=3): >>><<< 41445 1727204213.37415: _low_level_execute_command() done: rc=0, stdout=
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d", "fe80::58c9:79ff:feb9:fb44", "fe80::20cb:5dff:febd:5dc6"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d", "fe80::20cb:5dff:febd:5dc6"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204213.38234: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204213.38319: _low_level_execute_command(): starting 41445 1727204213.38323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204212.6010334-43314-239380082206544/ > /dev/null 2>&1 && sleep 0' 41445 1727204213.39444: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204213.39494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204213.39512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.39595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204213.39740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204213.39826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204213.41541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204213.41892: stderr chunk (state=3): >>><<< 41445 1727204213.41895: stdout chunk (state=3): >>><<< 41445 1727204213.41898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204213.41900: handler run complete 41445 1727204213.42415: variable 'ansible_facts' from source: unknown 41445 1727204213.42606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.43247: variable 'ansible_facts' from source: unknown 41445 1727204213.43553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.43981: attempt loop complete, returning result 41445 1727204213.43985: _execute() done 41445 1727204213.43987: dumping result to json 41445 1727204213.43989: done dumping result, returning 41445 1727204213.43990: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [028d2410-947f-bf02-eee4-00000000076f] 41445 1727204213.43997: sending task result for task 028d2410-947f-bf02-eee4-00000000076f ok: [managed-node3] 41445 1727204213.45263: no more pending results, returning what we have 41445 1727204213.45266: results queue empty 41445 1727204213.45267: checking for any_errors_fatal 41445 1727204213.45268: done checking for 
any_errors_fatal 41445 1727204213.45269: checking for max_fail_percentage 41445 1727204213.45270: done checking for max_fail_percentage 41445 1727204213.45271: checking to see if all hosts have failed and the running result is not ok 41445 1727204213.45272: done checking to see if all hosts have failed 41445 1727204213.45272: getting the remaining hosts for this loop 41445 1727204213.45273: done getting the remaining hosts for this loop 41445 1727204213.45279: getting the next task for host managed-node3 41445 1727204213.45284: done getting next task for host managed-node3 41445 1727204213.45292: ^ task is: TASK: meta (flush_handlers) 41445 1727204213.45296: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204213.45304: getting variables 41445 1727204213.45305: in VariableManager get_vars() 41445 1727204213.45329: Calling all_inventory to load vars for managed-node3 41445 1727204213.45331: Calling groups_inventory to load vars for managed-node3 41445 1727204213.45335: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204213.45341: done sending task result for task 028d2410-947f-bf02-eee4-00000000076f 41445 1727204213.45343: WORKER PROCESS EXITING 41445 1727204213.45353: Calling all_plugins_play to load vars for managed-node3 41445 1727204213.45355: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204213.45358: Calling groups_plugins_play to load vars for managed-node3 41445 1727204213.48096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.51578: done with get_vars() 41445 1727204213.51603: done getting variables 41445 1727204213.51669: in VariableManager get_vars() 41445 1727204213.51884: 
Calling all_inventory to load vars for managed-node3 41445 1727204213.51886: Calling groups_inventory to load vars for managed-node3 41445 1727204213.51889: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204213.51894: Calling all_plugins_play to load vars for managed-node3 41445 1727204213.51897: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204213.51899: Calling groups_plugins_play to load vars for managed-node3 41445 1727204213.54568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.58166: done with get_vars() 41445 1727204213.58200: done queuing things up, now waiting for results queue to drain 41445 1727204213.58203: results queue empty 41445 1727204213.58203: checking for any_errors_fatal 41445 1727204213.58207: done checking for any_errors_fatal 41445 1727204213.58208: checking for max_fail_percentage 41445 1727204213.58209: done checking for max_fail_percentage 41445 1727204213.58210: checking to see if all hosts have failed and the running result is not ok 41445 1727204213.58215: done checking to see if all hosts have failed 41445 1727204213.58216: getting the remaining hosts for this loop 41445 1727204213.58217: done getting the remaining hosts for this loop 41445 1727204213.58221: getting the next task for host managed-node3 41445 1727204213.58224: done getting next task for host managed-node3 41445 1727204213.58226: ^ task is: TASK: Include the task 'delete_interface.yml' 41445 1727204213.58228: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204213.58230: getting variables 41445 1727204213.58231: in VariableManager get_vars() 41445 1727204213.58241: Calling all_inventory to load vars for managed-node3 41445 1727204213.58243: Calling groups_inventory to load vars for managed-node3 41445 1727204213.58245: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204213.58251: Calling all_plugins_play to load vars for managed-node3 41445 1727204213.58254: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204213.58256: Calling groups_plugins_play to load vars for managed-node3 41445 1727204213.60882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.64092: done with get_vars() 41445 1727204213.64117: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:56:53 -0400 (0:00:01.089) 0:00:32.429 ***** 41445 1727204213.64194: entering _queue_task() for managed-node3/include_tasks 41445 1727204213.64948: worker is 1 (out of 1 available) 41445 1727204213.64961: exiting _queue_task() for managed-node3/include_tasks 41445 1727204213.64973: done queuing things up, now waiting for results queue to drain 41445 1727204213.64974: waiting for pending results... 
41445 1727204213.65614: running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' 41445 1727204213.65851: in run() - task 028d2410-947f-bf02-eee4-0000000000cf 41445 1727204213.66067: variable 'ansible_search_path' from source: unknown 41445 1727204213.66073: calling self._execute() 41445 1727204213.66221: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204213.66233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204213.66250: variable 'omit' from source: magic vars 41445 1727204213.67020: variable 'ansible_distribution_major_version' from source: facts 41445 1727204213.67383: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204213.67386: _execute() done 41445 1727204213.67390: dumping result to json 41445 1727204213.67393: done dumping result, returning 41445 1727204213.67395: done running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' [028d2410-947f-bf02-eee4-0000000000cf] 41445 1727204213.67398: sending task result for task 028d2410-947f-bf02-eee4-0000000000cf 41445 1727204213.67478: done sending task result for task 028d2410-947f-bf02-eee4-0000000000cf 41445 1727204213.67482: WORKER PROCESS EXITING 41445 1727204213.67510: no more pending results, returning what we have 41445 1727204213.67516: in VariableManager get_vars() 41445 1727204213.67551: Calling all_inventory to load vars for managed-node3 41445 1727204213.67554: Calling groups_inventory to load vars for managed-node3 41445 1727204213.67557: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204213.67571: Calling all_plugins_play to load vars for managed-node3 41445 1727204213.67577: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204213.67580: Calling groups_plugins_play to load vars for managed-node3 41445 1727204213.69923: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.71931: done with get_vars() 41445 1727204213.71954: variable 'ansible_search_path' from source: unknown 41445 1727204213.71970: we have included files to process 41445 1727204213.71971: generating all_blocks data 41445 1727204213.71973: done generating all_blocks data 41445 1727204213.71974: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41445 1727204213.71975: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41445 1727204213.71981: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41445 1727204213.72271: done processing included file 41445 1727204213.72273: iterating over new_blocks loaded from include file 41445 1727204213.72274: in VariableManager get_vars() 41445 1727204213.72288: done with get_vars() 41445 1727204213.72290: filtering new block on tags 41445 1727204213.72305: done filtering new block on tags 41445 1727204213.72308: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node3 41445 1727204213.72315: extending task lists for all hosts with included blocks 41445 1727204213.72350: done extending task lists 41445 1727204213.72351: done processing included files 41445 1727204213.72352: results queue empty 41445 1727204213.72353: checking for any_errors_fatal 41445 1727204213.72354: done checking for any_errors_fatal 41445 1727204213.72355: checking for max_fail_percentage 41445 1727204213.72356: done checking for max_fail_percentage 41445 1727204213.72357: checking to see if all hosts have failed and the running result 
is not ok 41445 1727204213.72357: done checking to see if all hosts have failed 41445 1727204213.72358: getting the remaining hosts for this loop 41445 1727204213.72359: done getting the remaining hosts for this loop 41445 1727204213.72361: getting the next task for host managed-node3 41445 1727204213.72365: done getting next task for host managed-node3 41445 1727204213.72367: ^ task is: TASK: Remove test interface if necessary 41445 1727204213.72369: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204213.72371: getting variables 41445 1727204213.72372: in VariableManager get_vars() 41445 1727204213.72383: Calling all_inventory to load vars for managed-node3 41445 1727204213.72385: Calling groups_inventory to load vars for managed-node3 41445 1727204213.72387: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204213.72393: Calling all_plugins_play to load vars for managed-node3 41445 1727204213.72395: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204213.72397: Calling groups_plugins_play to load vars for managed-node3 41445 1727204213.74508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204213.77689: done with get_vars() 41445 1727204213.77846: done getting variables 41445 1727204213.78050: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:56:53 -0400 (0:00:00.138) 0:00:32.568 ***** 41445 1727204213.78085: entering _queue_task() for managed-node3/command 41445 1727204213.78893: worker is 1 (out of 1 available) 41445 1727204213.78903: exiting _queue_task() for managed-node3/command 41445 1727204213.78915: done queuing things up, now waiting for results queue to drain 41445 1727204213.78916: waiting for pending results... 
41445 1727204213.79124: running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary 41445 1727204213.79361: in run() - task 028d2410-947f-bf02-eee4-000000000780 41445 1727204213.79366: variable 'ansible_search_path' from source: unknown 41445 1727204213.79369: variable 'ansible_search_path' from source: unknown 41445 1727204213.79373: calling self._execute() 41445 1727204213.79467: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204213.79482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204213.79516: variable 'omit' from source: magic vars 41445 1727204213.79913: variable 'ansible_distribution_major_version' from source: facts 41445 1727204213.79935: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204213.79947: variable 'omit' from source: magic vars 41445 1727204213.79988: variable 'omit' from source: magic vars 41445 1727204213.80105: variable 'interface' from source: set_fact 41445 1727204213.80140: variable 'omit' from source: magic vars 41445 1727204213.80225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204213.80346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204213.80353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204213.80369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204213.80416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204213.80500: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204213.80509: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204213.80554: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204213.80731: Set connection var ansible_shell_executable to /bin/sh 41445 1727204213.80840: Set connection var ansible_shell_type to sh 41445 1727204213.80843: Set connection var ansible_pipelining to False 41445 1727204213.80845: Set connection var ansible_timeout to 10 41445 1727204213.80848: Set connection var ansible_connection to ssh 41445 1727204213.80850: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204213.80852: variable 'ansible_shell_executable' from source: unknown 41445 1727204213.80899: variable 'ansible_connection' from source: unknown 41445 1727204213.81001: variable 'ansible_module_compression' from source: unknown 41445 1727204213.81005: variable 'ansible_shell_type' from source: unknown 41445 1727204213.81008: variable 'ansible_shell_executable' from source: unknown 41445 1727204213.81013: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204213.81015: variable 'ansible_pipelining' from source: unknown 41445 1727204213.81018: variable 'ansible_timeout' from source: unknown 41445 1727204213.81021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204213.81335: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204213.81437: variable 'omit' from source: magic vars 41445 1727204213.81440: starting attempt loop 41445 1727204213.81442: running the handler 41445 1727204213.81444: _low_level_execute_command(): starting 41445 1727204213.81545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204213.82577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 
1727204213.82664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.82723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204213.82758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204213.82879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204213.82931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204213.84589: stdout chunk (state=3): >>>/root <<< 41445 1727204213.84747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204213.84751: stdout chunk (state=3): >>><<< 41445 1727204213.84753: stderr chunk (state=3): >>><<< 41445 1727204213.84779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204213.84882: _low_level_execute_command(): starting 41445 1727204213.84886: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717 `" && echo ansible-tmp-1727204213.8478782-43441-142571663695717="` echo /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717 `" ) && sleep 0' 41445 1727204213.85471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204213.85489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204213.85504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204213.85531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204213.85548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204213.85597: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204213.85637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.85692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204213.85720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204213.85762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204213.85874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204213.87724: stdout chunk (state=3): >>>ansible-tmp-1727204213.8478782-43441-142571663695717=/root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717 <<< 41445 1727204213.87881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204213.87888: stdout chunk (state=3): >>><<< 41445 1727204213.87891: stderr chunk (state=3): >>><<< 41445 1727204213.87913: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204213.8478782-43441-142571663695717=/root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204213.87997: variable 'ansible_module_compression' from source: unknown 41445 1727204213.88012: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204213.88053: variable 'ansible_facts' from source: unknown 41445 1727204213.88150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py 41445 1727204213.88350: Sending initial data 41445 1727204213.88353: Sent initial data (156 bytes) 41445 1727204213.88939: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204213.88953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204213.88991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.89078: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204213.89102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204213.89164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204213.90866: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204213.90885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204213.90888: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp4d4vylk6 /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py <<< 41445 1727204213.90890: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py" <<< 41445 1727204213.90892: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp4d4vylk6" to remote "/root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py" <<< 41445 1727204213.91682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204213.91685: stdout chunk (state=3): >>><<< 41445 1727204213.91687: stderr chunk (state=3): >>><<< 41445 1727204213.91689: done transferring module to remote 41445 1727204213.91783: _low_level_execute_command(): starting 41445 1727204213.91787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/ /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py && sleep 0' 41445 1727204213.92688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204213.92698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204213.92712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204213.92733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204213.92736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 
41445 1727204213.92770: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204213.92774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.92779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204213.92785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204213.92788: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204213.92790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204213.92792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204213.92805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204213.92842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204213.92845: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204213.92847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.92891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204213.92902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204213.92920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204213.92981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204213.94688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204213.94743: stderr chunk (state=3): >>><<< 41445 1727204213.94760: stdout chunk (state=3): >>><<< 41445 1727204213.94787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204213.94799: _low_level_execute_command(): starting 41445 1727204213.94809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/AnsiballZ_command.py && sleep 0' 41445 1727204213.95462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204213.95515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204213.95535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204213.95556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204213.95640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204214.11741: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:56:54.103441", "end": "2024-09-24 14:56:54.114584", "delta": "0:00:00.011143", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204214.14120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204214.14124: stdout chunk (state=3): >>><<< 41445 1727204214.14127: stderr chunk (state=3): >>><<< 41445 1727204214.14129: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:56:54.103441", "end": "2024-09-24 14:56:54.114584", "delta": "0:00:00.011143", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204214.14132: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204214.14135: _low_level_execute_command(): starting 41445 1727204214.14137: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204213.8478782-43441-142571663695717/ > /dev/null 2>&1 && sleep 0' 41445 1727204214.15353: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204214.15373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204214.15426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41445 1727204214.15541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204214.15620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204214.15667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204214.15736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204214.17574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204214.17615: stdout chunk (state=3): >>><<< 41445 1727204214.17629: stderr chunk (state=3): >>><<< 41445 1727204214.17702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0
41445 1727204214.17724: handler run complete
41445 1727204214.17755: Evaluated conditional (False): False
41445 1727204214.17795: attempt loop complete, returning result
41445 1727204214.17826: _execute() done
41445 1727204214.17834: dumping result to json
41445 1727204214.17843: done dumping result, returning
41445 1727204214.17892: done running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary [028d2410-947f-bf02-eee4-000000000780]
41445 1727204214.17904: sending task result for task 028d2410-947f-bf02-eee4-000000000780
41445 1727204214.18283: done sending task result for task 028d2410-947f-bf02-eee4-000000000780
41445 1727204214.18288: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "ethtest0"
    ],
    "delta": "0:00:00.011143",
    "end": "2024-09-24 14:56:54.114584",
    "rc": 0,
    "start": "2024-09-24 14:56:54.103441"
}
41445 1727204214.18360: no more pending results, returning what we have
41445 1727204214.18364: results queue empty
41445 1727204214.18365: checking for any_errors_fatal
41445 1727204214.18366: done checking for any_errors_fatal
41445 1727204214.18367: checking for max_fail_percentage
41445 1727204214.18368: done checking for max_fail_percentage
41445 1727204214.18369: checking to see if all hosts have failed and the running result is not ok
41445 1727204214.18370: done checking to see if all hosts have failed
41445 1727204214.18371: getting the remaining hosts for this loop
41445 1727204214.18372: done getting the remaining hosts for this loop
41445 1727204214.18378: getting the next task for host managed-node3
41445 1727204214.18388: done getting next task for host managed-node3
41445 1727204214.18390: ^ task is: TASK: meta (flush_handlers)
41445 1727204214.18392: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204214.18397: getting variables 41445 1727204214.18398: in VariableManager get_vars() 41445 1727204214.18432: Calling all_inventory to load vars for managed-node3 41445 1727204214.18435: Calling groups_inventory to load vars for managed-node3 41445 1727204214.18439: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204214.18449: Calling all_plugins_play to load vars for managed-node3 41445 1727204214.18453: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204214.18456: Calling groups_plugins_play to load vars for managed-node3 41445 1727204214.22430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204214.26777: done with get_vars() 41445 1727204214.26801: done getting variables 41445 1727204214.26871: in VariableManager get_vars() 41445 1727204214.26986: Calling all_inventory to load vars for managed-node3 41445 1727204214.26989: Calling groups_inventory to load vars for managed-node3 41445 1727204214.26992: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204214.26997: Calling all_plugins_play to load vars for managed-node3 41445 1727204214.26999: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204214.27002: Calling groups_plugins_play to load vars for managed-node3 41445 1727204214.29680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204214.32764: done with get_vars() 41445 1727204214.32970: done queuing things up, now waiting for results queue to drain 41445 1727204214.32972: results queue empty 41445 1727204214.32973: checking for any_errors_fatal 41445 1727204214.32979: done checking for any_errors_fatal 41445 1727204214.32980: checking for max_fail_percentage 41445 1727204214.32981: done checking for 
max_fail_percentage 41445 1727204214.32981: checking to see if all hosts have failed and the running result is not ok 41445 1727204214.32982: done checking to see if all hosts have failed 41445 1727204214.32983: getting the remaining hosts for this loop 41445 1727204214.32984: done getting the remaining hosts for this loop 41445 1727204214.32987: getting the next task for host managed-node3 41445 1727204214.32991: done getting next task for host managed-node3 41445 1727204214.32993: ^ task is: TASK: meta (flush_handlers) 41445 1727204214.32994: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204214.32997: getting variables 41445 1727204214.32999: in VariableManager get_vars() 41445 1727204214.33008: Calling all_inventory to load vars for managed-node3 41445 1727204214.33010: Calling groups_inventory to load vars for managed-node3 41445 1727204214.33012: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204214.33018: Calling all_plugins_play to load vars for managed-node3 41445 1727204214.33025: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204214.33028: Calling groups_plugins_play to load vars for managed-node3 41445 1727204214.36264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204214.39499: done with get_vars() 41445 1727204214.39529: done getting variables 41445 1727204214.39787: in VariableManager get_vars() 41445 1727204214.39798: Calling all_inventory to load vars for managed-node3 41445 1727204214.39801: Calling groups_inventory to load vars for managed-node3 41445 1727204214.39803: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204214.39808: Calling 
all_plugins_play to load vars for managed-node3 41445 1727204214.39811: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204214.39813: Calling groups_plugins_play to load vars for managed-node3 41445 1727204214.42404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204214.45589: done with get_vars() 41445 1727204214.45623: done queuing things up, now waiting for results queue to drain 41445 1727204214.45625: results queue empty 41445 1727204214.45626: checking for any_errors_fatal 41445 1727204214.45627: done checking for any_errors_fatal 41445 1727204214.45628: checking for max_fail_percentage 41445 1727204214.45629: done checking for max_fail_percentage 41445 1727204214.45630: checking to see if all hosts have failed and the running result is not ok 41445 1727204214.45630: done checking to see if all hosts have failed 41445 1727204214.45631: getting the remaining hosts for this loop 41445 1727204214.45632: done getting the remaining hosts for this loop 41445 1727204214.45634: getting the next task for host managed-node3 41445 1727204214.45638: done getting next task for host managed-node3 41445 1727204214.45638: ^ task is: None 41445 1727204214.45640: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204214.45641: done queuing things up, now waiting for results queue to drain 41445 1727204214.45642: results queue empty 41445 1727204214.45642: checking for any_errors_fatal 41445 1727204214.45643: done checking for any_errors_fatal 41445 1727204214.45644: checking for max_fail_percentage 41445 1727204214.45645: done checking for max_fail_percentage 41445 1727204214.45645: checking to see if all hosts have failed and the running result is not ok 41445 1727204214.45646: done checking to see if all hosts have failed 41445 1727204214.45647: getting the next task for host managed-node3 41445 1727204214.45650: done getting next task for host managed-node3 41445 1727204214.45650: ^ task is: None 41445 1727204214.45652: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41445 1727204214.45693: in VariableManager get_vars()
41445 1727204214.45716: done with get_vars()
41445 1727204214.45723: in VariableManager get_vars()
41445 1727204214.45737: done with get_vars()
41445 1727204214.45742: variable 'omit' from source: magic vars
41445 1727204214.45858: variable 'profile' from source: play vars
41445 1727204214.46164: in VariableManager get_vars()
41445 1727204214.46182: done with get_vars()
41445 1727204214.46204: variable 'omit' from source: magic vars
41445 1727204214.46267: variable 'profile' from source: play vars

PLAY [Remove {{ profile }}] ****************************************************
41445 1727204214.47951: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
41445 1727204214.48127: getting the remaining hosts for this loop
41445 1727204214.48129: done getting the remaining hosts for this loop
41445 1727204214.48132: getting the next task for host managed-node3
41445 1727204214.48135: done getting next task for host managed-node3
41445 1727204214.48137: ^ task is: TASK: Gathering Facts
41445 1727204214.48138: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
41445 1727204214.48140: getting variables
41445 1727204214.48141: in VariableManager get_vars()
41445 1727204214.48154: Calling all_inventory to load vars for managed-node3
41445 1727204214.48157: Calling groups_inventory to load vars for managed-node3
41445 1727204214.48159: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204214.48164: Calling all_plugins_play to load vars for managed-node3
41445 1727204214.48167: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204214.48169: Calling groups_plugins_play to load vars for managed-node3
41445 1727204214.50961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204214.53478: done with get_vars()
41445 1727204214.53508: done getting variables
41445 1727204214.53561: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Tuesday 24 September 2024 14:56:54 -0400 (0:00:00.755) 0:00:33.323 *****
41445 1727204214.53600: entering _queue_task() for managed-node3/gather_facts
41445 1727204214.54206: worker is 1 (out of 1 available)
41445 1727204214.54214: exiting _queue_task() for managed-node3/gather_facts
41445 1727204214.54224: done queuing things up, now waiting for results queue to drain
41445 1727204214.54226: waiting for pending results...
41445 1727204214.54570: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41445 1727204214.54869: in run() - task 028d2410-947f-bf02-eee4-00000000078e 41445 1727204214.55383: variable 'ansible_search_path' from source: unknown 41445 1727204214.55391: calling self._execute() 41445 1727204214.55424: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204214.55437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204214.55451: variable 'omit' from source: magic vars 41445 1727204214.56156: variable 'ansible_distribution_major_version' from source: facts 41445 1727204214.56383: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204214.56386: variable 'omit' from source: magic vars 41445 1727204214.56388: variable 'omit' from source: magic vars 41445 1727204214.56390: variable 'omit' from source: magic vars 41445 1727204214.56502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204214.56543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204214.56622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204214.56645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204214.56816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204214.56820: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204214.56823: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204214.56825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204214.56969: Set connection var ansible_shell_executable to /bin/sh 41445 1727204214.57040: Set 
connection var ansible_shell_type to sh 41445 1727204214.57051: Set connection var ansible_pipelining to False 41445 1727204214.57064: Set connection var ansible_timeout to 10 41445 1727204214.57070: Set connection var ansible_connection to ssh 41445 1727204214.57084: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204214.57117: variable 'ansible_shell_executable' from source: unknown 41445 1727204214.57382: variable 'ansible_connection' from source: unknown 41445 1727204214.57385: variable 'ansible_module_compression' from source: unknown 41445 1727204214.57388: variable 'ansible_shell_type' from source: unknown 41445 1727204214.57391: variable 'ansible_shell_executable' from source: unknown 41445 1727204214.57393: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204214.57395: variable 'ansible_pipelining' from source: unknown 41445 1727204214.57397: variable 'ansible_timeout' from source: unknown 41445 1727204214.57399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204214.57569: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204214.57708: variable 'omit' from source: magic vars 41445 1727204214.57712: starting attempt loop 41445 1727204214.57714: running the handler 41445 1727204214.57731: variable 'ansible_facts' from source: unknown 41445 1727204214.57755: _low_level_execute_command(): starting 41445 1727204214.57769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204214.58785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204214.58803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 
1727204214.58818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204214.58915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204214.58934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204214.58953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204214.58979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204214.59106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204214.60970: stdout chunk (state=3): >>>/root <<< 41445 1727204214.60988: stdout chunk (state=3): >>><<< 41445 1727204214.61300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204214.61304: stderr chunk (state=3): >>><<< 41445 1727204214.61310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204214.61313: _low_level_execute_command(): starting 41445 1727204214.61316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153 `" && echo ansible-tmp-1727204214.6121156-43522-39790811269153="` echo /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153 `" ) && sleep 0' 41445 1727204214.62435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204214.62490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204214.62679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204214.62701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204214.62714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204214.62821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204214.64696: stdout chunk (state=3): >>>ansible-tmp-1727204214.6121156-43522-39790811269153=/root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153 <<< 41445 1727204214.64842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204214.64852: stdout chunk (state=3): >>><<< 41445 1727204214.64892: stderr chunk (state=3): >>><<< 41445 1727204214.64914: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204214.6121156-43522-39790811269153=/root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204214.64978: variable 'ansible_module_compression' from source: unknown 41445 1727204214.65106: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 1727204214.65227: variable 'ansible_facts' from source: unknown 41445 1727204214.65703: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py 41445 1727204214.66133: Sending initial data 41445 1727204214.66136: Sent initial data (153 bytes) 41445 1727204214.67996: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204214.68113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204214.68237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204214.68551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204214.70007: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204214.70194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpmek_p7_h" to remote "/root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py" <<< 41445 1727204214.70198: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpmek_p7_h /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py <<< 41445 1727204214.72667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204214.72689: stderr chunk (state=3): >>><<< 41445 1727204214.72699: stdout chunk (state=3): >>><<< 41445 1727204214.72727: done transferring module to remote 41445 1727204214.72805: _low_level_execute_command(): starting 41445 1727204214.72835: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/ /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py && sleep 0' 41445 1727204214.74297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204214.74320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204214.74328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204214.74346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204214.74434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204214.76274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204214.76323: stderr chunk (state=3): >>><<< 41445 1727204214.76334: stdout chunk (state=3): >>><<< 41445 1727204214.76367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204214.76469: _low_level_execute_command(): starting 41445 1727204214.76484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/AnsiballZ_setup.py && sleep 0' 41445 1727204214.77625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204214.77791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204214.77961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204214.78001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204215.41191: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3275, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca87<<< 41445 1727204215.41267: 
stdout chunk (state=3): >>>0-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 792, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788700672, "block_size": 4096, "block_total": 65519099, "block_available": 63913257, "block_used": 1605842, "inode_total": 131070960, "inode_available": 131027340, "inode_used": 43620, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": 
"24", "hour": "14", "minute": "56", "second": "55", "epoch": "1727204215", "epoch_int": "1727204215", "date": "2024-09-24", "time": "14:56:55", "iso8601_micro": "2024-09-24T18:56:55.360529Z", "iso8601": "2024-09-24T18:56:55Z", "iso8601_basic": "20240924T145655360529", "iso8601_basic_short": "20240924T145655", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.66943359375, "5m": 0.54541015625, "15m": 0.31982421875}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": 
"off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 
5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204215.43175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204215.43189: stdout chunk (state=3): >>><<< 41445 1727204215.43203: stderr chunk (state=3): >>><<< 41445 1727204215.43418: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2935, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 596, "free": 2935}, "nocache": {"free": 3275, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": 
"4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 792, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788700672, "block_size": 4096, "block_total": 65519099, "block_available": 63913257, "block_used": 1605842, "inode_total": 131070960, "inode_available": 
131027340, "inode_used": 43620, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "55", "epoch": "1727204215", "epoch_int": "1727204215", "date": "2024-09-24", "time": "14:56:55", "iso8601_micro": "2024-09-24T18:56:55.360529Z", "iso8601": "2024-09-24T18:56:55Z", "iso8601_basic": "20240924T145655360529", "iso8601_basic_short": "20240924T145655", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.66943359375, "5m": 
0.54541015625, "15m": 0.31982421875}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204215.44329: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204215.44417: _low_level_execute_command(): starting 41445 1727204215.44427: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204214.6121156-43522-39790811269153/ > /dev/null 2>&1 && sleep 0' 41445 1727204215.45967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204215.46091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204215.46171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204215.46307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204215.48205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204215.48253: stdout chunk (state=3): >>><<< 41445 1727204215.48266: stderr chunk (state=3): >>><<< 41445 1727204215.48554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204215.48557: handler run complete 41445 1727204215.48655: variable 'ansible_facts' from source: unknown 41445 1727204215.49082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.50017: variable 'ansible_facts' from source: unknown 41445 1727204215.50437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.50713: attempt loop complete, returning result 41445 1727204215.50724: _execute() done 41445 1727204215.50732: dumping result to json 41445 1727204215.50769: done dumping result, returning 41445 1727204215.50784: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [028d2410-947f-bf02-eee4-00000000078e] 41445 1727204215.50796: sending task result for task 028d2410-947f-bf02-eee4-00000000078e 41445 1727204215.51782: done sending task result for task 028d2410-947f-bf02-eee4-00000000078e 41445 1727204215.51786: WORKER PROCESS EXITING ok: [managed-node3] 41445 1727204215.52293: no more pending results, returning what we have 41445 1727204215.52296: results queue empty 41445 1727204215.52297: checking for any_errors_fatal 41445 1727204215.52298: done checking for any_errors_fatal 41445 1727204215.52299: checking for max_fail_percentage 41445 1727204215.52301: done checking for max_fail_percentage 41445 1727204215.52302: 
checking to see if all hosts have failed and the running result is not ok 41445 1727204215.52303: done checking to see if all hosts have failed 41445 1727204215.52304: getting the remaining hosts for this loop 41445 1727204215.52305: done getting the remaining hosts for this loop 41445 1727204215.52309: getting the next task for host managed-node3 41445 1727204215.52314: done getting next task for host managed-node3 41445 1727204215.52316: ^ task is: TASK: meta (flush_handlers) 41445 1727204215.52318: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204215.52322: getting variables 41445 1727204215.52323: in VariableManager get_vars() 41445 1727204215.52352: Calling all_inventory to load vars for managed-node3 41445 1727204215.52354: Calling groups_inventory to load vars for managed-node3 41445 1727204215.52357: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.52366: Calling all_plugins_play to load vars for managed-node3 41445 1727204215.52369: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.52372: Calling groups_plugins_play to load vars for managed-node3 41445 1727204215.55903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.57931: done with get_vars() 41445 1727204215.57964: done getting variables 41445 1727204215.58052: in VariableManager get_vars() 41445 1727204215.58065: Calling all_inventory to load vars for managed-node3 41445 1727204215.58067: Calling groups_inventory to load vars for managed-node3 41445 1727204215.58069: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.58074: Calling all_plugins_play to load vars for managed-node3 41445 
1727204215.58078: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.58081: Calling groups_plugins_play to load vars for managed-node3 41445 1727204215.59259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.61102: done with get_vars() 41445 1727204215.61130: done queuing things up, now waiting for results queue to drain 41445 1727204215.61132: results queue empty 41445 1727204215.61133: checking for any_errors_fatal 41445 1727204215.61141: done checking for any_errors_fatal 41445 1727204215.61142: checking for max_fail_percentage 41445 1727204215.61143: done checking for max_fail_percentage 41445 1727204215.61144: checking to see if all hosts have failed and the running result is not ok 41445 1727204215.61144: done checking to see if all hosts have failed 41445 1727204215.61150: getting the remaining hosts for this loop 41445 1727204215.61151: done getting the remaining hosts for this loop 41445 1727204215.61154: getting the next task for host managed-node3 41445 1727204215.61158: done getting next task for host managed-node3 41445 1727204215.61161: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204215.61162: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204215.61172: getting variables 41445 1727204215.61173: in VariableManager get_vars() 41445 1727204215.61189: Calling all_inventory to load vars for managed-node3 41445 1727204215.61191: Calling groups_inventory to load vars for managed-node3 41445 1727204215.61193: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.61197: Calling all_plugins_play to load vars for managed-node3 41445 1727204215.61200: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.61202: Calling groups_plugins_play to load vars for managed-node3 41445 1727204215.68294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.70155: done with get_vars() 41445 1727204215.70258: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:55 -0400 (0:00:01.167) 0:00:34.491 ***** 41445 1727204215.70333: entering _queue_task() for managed-node3/include_tasks 41445 1727204215.70790: worker is 1 (out of 1 available) 41445 1727204215.70914: exiting _queue_task() for managed-node3/include_tasks 41445 1727204215.70925: done queuing things up, now waiting for results queue to drain 41445 1727204215.70926: waiting for pending results... 
41445 1727204215.71113: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41445 1727204215.71242: in run() - task 028d2410-947f-bf02-eee4-0000000000d7 41445 1727204215.71266: variable 'ansible_search_path' from source: unknown 41445 1727204215.71273: variable 'ansible_search_path' from source: unknown 41445 1727204215.71314: calling self._execute() 41445 1727204215.71448: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204215.71452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204215.71455: variable 'omit' from source: magic vars 41445 1727204215.71922: variable 'ansible_distribution_major_version' from source: facts 41445 1727204215.71939: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204215.71981: _execute() done 41445 1727204215.71989: dumping result to json 41445 1727204215.71992: done dumping result, returning 41445 1727204215.71995: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-bf02-eee4-0000000000d7] 41445 1727204215.71997: sending task result for task 028d2410-947f-bf02-eee4-0000000000d7 41445 1727204215.72227: done sending task result for task 028d2410-947f-bf02-eee4-0000000000d7 41445 1727204215.72230: WORKER PROCESS EXITING 41445 1727204215.72268: no more pending results, returning what we have 41445 1727204215.72274: in VariableManager get_vars() 41445 1727204215.72394: Calling all_inventory to load vars for managed-node3 41445 1727204215.72398: Calling groups_inventory to load vars for managed-node3 41445 1727204215.72400: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.72412: Calling all_plugins_play to load vars for managed-node3 41445 1727204215.72419: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.72424: Calling 
groups_plugins_play to load vars for managed-node3 41445 1727204215.73869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.76103: done with get_vars() 41445 1727204215.76133: variable 'ansible_search_path' from source: unknown 41445 1727204215.76135: variable 'ansible_search_path' from source: unknown 41445 1727204215.76175: we have included files to process 41445 1727204215.76177: generating all_blocks data 41445 1727204215.76179: done generating all_blocks data 41445 1727204215.76180: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204215.76181: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204215.76183: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41445 1727204215.76844: done processing included file 41445 1727204215.76846: iterating over new_blocks loaded from include file 41445 1727204215.76848: in VariableManager get_vars() 41445 1727204215.76872: done with get_vars() 41445 1727204215.76875: filtering new block on tags 41445 1727204215.76894: done filtering new block on tags 41445 1727204215.76897: in VariableManager get_vars() 41445 1727204215.76931: done with get_vars() 41445 1727204215.76933: filtering new block on tags 41445 1727204215.76953: done filtering new block on tags 41445 1727204215.76956: in VariableManager get_vars() 41445 1727204215.76978: done with get_vars() 41445 1727204215.76980: filtering new block on tags 41445 1727204215.76997: done filtering new block on tags 41445 1727204215.76999: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 41445 1727204215.77004: extending task lists for 
all hosts with included blocks 41445 1727204215.77436: done extending task lists 41445 1727204215.77438: done processing included files 41445 1727204215.77439: results queue empty 41445 1727204215.77440: checking for any_errors_fatal 41445 1727204215.77441: done checking for any_errors_fatal 41445 1727204215.77442: checking for max_fail_percentage 41445 1727204215.77443: done checking for max_fail_percentage 41445 1727204215.77444: checking to see if all hosts have failed and the running result is not ok 41445 1727204215.77445: done checking to see if all hosts have failed 41445 1727204215.77446: getting the remaining hosts for this loop 41445 1727204215.77447: done getting the remaining hosts for this loop 41445 1727204215.77449: getting the next task for host managed-node3 41445 1727204215.77454: done getting next task for host managed-node3 41445 1727204215.77457: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204215.77459: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204215.77477: getting variables 41445 1727204215.77479: in VariableManager get_vars() 41445 1727204215.77494: Calling all_inventory to load vars for managed-node3 41445 1727204215.77496: Calling groups_inventory to load vars for managed-node3 41445 1727204215.77498: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.77505: Calling all_plugins_play to load vars for managed-node3 41445 1727204215.77508: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.77513: Calling groups_plugins_play to load vars for managed-node3 41445 1727204215.78817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.80534: done with get_vars() 41445 1727204215.80562: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.103) 0:00:34.594 ***** 41445 1727204215.80652: entering _queue_task() for managed-node3/setup 41445 1727204215.81047: worker is 1 (out of 1 available) 41445 1727204215.81068: exiting _queue_task() for managed-node3/setup 41445 1727204215.81082: done queuing things up, now waiting for results queue to drain 41445 1727204215.81084: waiting for pending results... 
41445 1727204215.81367: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41445 1727204215.81571: in run() - task 028d2410-947f-bf02-eee4-0000000007cf 41445 1727204215.81577: variable 'ansible_search_path' from source: unknown 41445 1727204215.81580: variable 'ansible_search_path' from source: unknown 41445 1727204215.81584: calling self._execute() 41445 1727204215.81648: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204215.81654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204215.81664: variable 'omit' from source: magic vars 41445 1727204215.82077: variable 'ansible_distribution_major_version' from source: facts 41445 1727204215.82090: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204215.82324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204215.83947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204215.84005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204215.84030: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204215.84054: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204215.84073: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204215.84137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204215.84157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204215.84177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204215.84204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204215.84220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204215.84255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204215.84271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204215.84289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204215.84318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204215.84331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204215.84438: variable '__network_required_facts' from source: role 
'' defaults 41445 1727204215.84445: variable 'ansible_facts' from source: unknown 41445 1727204215.85331: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41445 1727204215.85334: when evaluation is False, skipping this task 41445 1727204215.85336: _execute() done 41445 1727204215.85338: dumping result to json 41445 1727204215.85340: done dumping result, returning 41445 1727204215.85342: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-bf02-eee4-0000000007cf] 41445 1727204215.85344: sending task result for task 028d2410-947f-bf02-eee4-0000000007cf 41445 1727204215.85408: done sending task result for task 028d2410-947f-bf02-eee4-0000000007cf 41445 1727204215.85413: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204215.85473: no more pending results, returning what we have 41445 1727204215.85478: results queue empty 41445 1727204215.85479: checking for any_errors_fatal 41445 1727204215.85480: done checking for any_errors_fatal 41445 1727204215.85481: checking for max_fail_percentage 41445 1727204215.85482: done checking for max_fail_percentage 41445 1727204215.85483: checking to see if all hosts have failed and the running result is not ok 41445 1727204215.85484: done checking to see if all hosts have failed 41445 1727204215.85485: getting the remaining hosts for this loop 41445 1727204215.85486: done getting the remaining hosts for this loop 41445 1727204215.85489: getting the next task for host managed-node3 41445 1727204215.85497: done getting next task for host managed-node3 41445 1727204215.85500: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204215.85502: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204215.85518: getting variables 41445 1727204215.85519: in VariableManager get_vars() 41445 1727204215.85558: Calling all_inventory to load vars for managed-node3 41445 1727204215.85560: Calling groups_inventory to load vars for managed-node3 41445 1727204215.85562: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.85571: Calling all_plugins_play to load vars for managed-node3 41445 1727204215.85573: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.85634: Calling groups_plugins_play to load vars for managed-node3 41445 1727204215.86557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.87634: done with get_vars() 41445 1727204215.87653: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.070) 0:00:34.665 ***** 41445 1727204215.87724: entering _queue_task() for managed-node3/stat 41445 1727204215.87980: worker is 1 (out of 1 available) 41445 1727204215.87994: exiting _queue_task() for managed-node3/stat 41445 1727204215.88003: done queuing things up, now waiting for results queue to drain 41445 1727204215.88005: waiting for pending results... 
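The skip recorded above comes from the role's required-facts guard: the condition `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False, meaning every fact the role needs was already gathered, so the extra setup run was unnecessary. A minimal Python sketch of what that Jinja2 `difference` test computes (the fact names below are hypothetical illustrations, not values taken from this log):

```python
# Sketch of the set-difference check behind the
# "Ensure ansible_facts used by role are present" guard.
def missing_facts(required, gathered):
    """Return the required fact names not present among gathered fact keys."""
    return [name for name in required if name not in gathered]

# Hypothetical example values standing in for __network_required_facts
# and the ansible_facts dictionary.
required = ["distribution", "os_family"]
gathered = {"distribution": "Fedora", "os_family": "RedHat"}

# Nothing is missing, so `length > 0` is False and the setup task is skipped.
print(len(missing_facts(required, gathered)) > 0)
```

When the difference is non-empty the guard is True and the role runs a targeted `setup` to gather only the missing facts, which is why the task appears (and is skipped) on every role invocation.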
41445 1727204215.88443: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 41445 1727204215.88449: in run() - task 028d2410-947f-bf02-eee4-0000000007d1 41445 1727204215.88454: variable 'ansible_search_path' from source: unknown 41445 1727204215.88457: variable 'ansible_search_path' from source: unknown 41445 1727204215.88568: calling self._execute() 41445 1727204215.88596: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204215.88604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204215.88627: variable 'omit' from source: magic vars 41445 1727204215.89168: variable 'ansible_distribution_major_version' from source: facts 41445 1727204215.89172: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204215.89329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204215.90254: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204215.90419: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204215.90523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204215.90581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204215.90786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204215.90791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204215.90843: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204215.90907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204215.91015: variable '__network_is_ostree' from source: set_fact 41445 1727204215.91022: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204215.91026: when evaluation is False, skipping this task 41445 1727204215.91028: _execute() done 41445 1727204215.91105: dumping result to json 41445 1727204215.91108: done dumping result, returning 41445 1727204215.91113: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-bf02-eee4-0000000007d1] 41445 1727204215.91115: sending task result for task 028d2410-947f-bf02-eee4-0000000007d1 41445 1727204215.91389: done sending task result for task 028d2410-947f-bf02-eee4-0000000007d1 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204215.91433: no more pending results, returning what we have 41445 1727204215.91437: results queue empty 41445 1727204215.91437: checking for any_errors_fatal 41445 1727204215.91444: done checking for any_errors_fatal 41445 1727204215.91445: checking for max_fail_percentage 41445 1727204215.91446: done checking for max_fail_percentage 41445 1727204215.91447: checking to see if all hosts have failed and the running result is not ok 41445 1727204215.91448: done checking to see if all hosts have failed 41445 1727204215.91449: getting the remaining hosts for this loop 41445 1727204215.91450: done getting the remaining hosts for this loop 41445 1727204215.91567: getting the next task for host 
managed-node3 41445 1727204215.91574: done getting next task for host managed-node3 41445 1727204215.91581: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204215.91583: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204215.91598: WORKER PROCESS EXITING 41445 1727204215.91684: getting variables 41445 1727204215.91686: in VariableManager get_vars() 41445 1727204215.91898: Calling all_inventory to load vars for managed-node3 41445 1727204215.91902: Calling groups_inventory to load vars for managed-node3 41445 1727204215.91905: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204215.91913: Calling all_plugins_play to load vars for managed-node3 41445 1727204215.91917: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204215.91920: Calling groups_plugins_play to load vars for managed-node3 41445 1727204215.94151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204215.96787: done with get_vars() 41445 1727204215.96816: done getting variables 41445 1727204215.96973: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.092) 0:00:34.757 ***** 41445 1727204215.97012: entering _queue_task() for managed-node3/set_fact 41445 1727204215.97510: worker is 1 (out of 1 available) 41445 1727204215.97523: exiting _queue_task() for managed-node3/set_fact 41445 1727204215.97533: done queuing things up, now waiting for results queue to drain 41445 1727204215.97534: waiting for pending results... 41445 1727204215.98097: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41445 1727204215.98447: in run() - task 028d2410-947f-bf02-eee4-0000000007d2 41445 1727204215.98464: variable 'ansible_search_path' from source: unknown 41445 1727204215.98468: variable 'ansible_search_path' from source: unknown 41445 1727204215.98620: calling self._execute() 41445 1727204215.98803: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204215.98806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204215.98830: variable 'omit' from source: magic vars 41445 1727204215.99633: variable 'ansible_distribution_major_version' from source: facts 41445 1727204215.99645: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204216.00587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204216.01133: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204216.01337: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204216.01370: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 
1727204216.01564: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204216.01927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204216.01952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204216.02207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204216.02234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204216.02526: variable '__network_is_ostree' from source: set_fact 41445 1727204216.02533: Evaluated conditional (not __network_is_ostree is defined): False 41445 1727204216.02536: when evaluation is False, skipping this task 41445 1727204216.02539: _execute() done 41445 1727204216.02541: dumping result to json 41445 1727204216.02543: done dumping result, returning 41445 1727204216.02553: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-bf02-eee4-0000000007d2] 41445 1727204216.02580: sending task result for task 028d2410-947f-bf02-eee4-0000000007d2 41445 1727204216.03241: done sending task result for task 028d2410-947f-bf02-eee4-0000000007d2 41445 1727204216.03244: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41445 1727204216.03300: no more pending results, returning what we 
have 41445 1727204216.03303: results queue empty 41445 1727204216.03304: checking for any_errors_fatal 41445 1727204216.03310: done checking for any_errors_fatal 41445 1727204216.03311: checking for max_fail_percentage 41445 1727204216.03313: done checking for max_fail_percentage 41445 1727204216.03314: checking to see if all hosts have failed and the running result is not ok 41445 1727204216.03315: done checking to see if all hosts have failed 41445 1727204216.03316: getting the remaining hosts for this loop 41445 1727204216.03317: done getting the remaining hosts for this loop 41445 1727204216.03321: getting the next task for host managed-node3 41445 1727204216.03330: done getting next task for host managed-node3 41445 1727204216.03333: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204216.03336: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204216.03349: getting variables 41445 1727204216.03351: in VariableManager get_vars() 41445 1727204216.03506: Calling all_inventory to load vars for managed-node3 41445 1727204216.03510: Calling groups_inventory to load vars for managed-node3 41445 1727204216.03512: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204216.03522: Calling all_plugins_play to load vars for managed-node3 41445 1727204216.03525: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204216.03528: Calling groups_plugins_play to load vars for managed-node3 41445 1727204216.08129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204216.11697: done with get_vars() 41445 1727204216.11738: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.150) 0:00:34.908 ***** 41445 1727204216.12067: entering _queue_task() for managed-node3/service_facts 41445 1727204216.13141: worker is 1 (out of 1 available) 41445 1727204216.13156: exiting _queue_task() for managed-node3/service_facts 41445 1727204216.13167: done queuing things up, now waiting for results queue to drain 41445 1727204216.13168: waiting for pending results... 
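Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") were skipped for the same reason: `__network_is_ostree` had already been populated by an earlier `set_fact`, so the guard `not __network_is_ostree is defined` evaluated to False. A hedged Python sketch of that probe-once pattern (the probe itself is a stand-in, not the role's actual stat logic):

```python
# Sketch of the "probe once, then reuse" pattern the ostree tasks follow:
# the stat and set_fact tasks only run while the flag is still undefined.
def should_run_probe(facts):
    """Equivalent of the task guard: when: not __network_is_ostree is defined."""
    return "__network_is_ostree" not in facts

# The flag was set by an earlier run of set_facts.yml in this play,
# so the guard is False and both tasks are skipped.
facts = {"__network_is_ostree": False}
print(should_run_probe(facts))
```

This is why re-including `set_facts.yml` later in the same play (as the include at `set_facts.yml:3` onward does here) costs only a conditional evaluation per task rather than a remote round-trip.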
41445 1727204216.13709: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 41445 1727204216.14066: in run() - task 028d2410-947f-bf02-eee4-0000000007d4 41445 1727204216.14069: variable 'ansible_search_path' from source: unknown 41445 1727204216.14072: variable 'ansible_search_path' from source: unknown 41445 1727204216.14094: calling self._execute() 41445 1727204216.14278: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204216.14366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204216.14393: variable 'omit' from source: magic vars 41445 1727204216.15260: variable 'ansible_distribution_major_version' from source: facts 41445 1727204216.15264: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204216.15267: variable 'omit' from source: magic vars 41445 1727204216.15425: variable 'omit' from source: magic vars 41445 1727204216.15463: variable 'omit' from source: magic vars 41445 1727204216.15564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204216.15667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204216.15859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204216.15863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204216.16381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204216.16385: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204216.16388: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204216.16390: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 41445 1727204216.16446: Set connection var ansible_shell_executable to /bin/sh 41445 1727204216.16600: Set connection var ansible_shell_type to sh 41445 1727204216.16620: Set connection var ansible_pipelining to False 41445 1727204216.16654: Set connection var ansible_timeout to 10 41445 1727204216.16683: Set connection var ansible_connection to ssh 41445 1727204216.16697: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204216.16942: variable 'ansible_shell_executable' from source: unknown 41445 1727204216.16946: variable 'ansible_connection' from source: unknown 41445 1727204216.16949: variable 'ansible_module_compression' from source: unknown 41445 1727204216.16951: variable 'ansible_shell_type' from source: unknown 41445 1727204216.16953: variable 'ansible_shell_executable' from source: unknown 41445 1727204216.16955: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204216.16957: variable 'ansible_pipelining' from source: unknown 41445 1727204216.16959: variable 'ansible_timeout' from source: unknown 41445 1727204216.16961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204216.17403: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204216.17408: variable 'omit' from source: magic vars 41445 1727204216.17412: starting attempt loop 41445 1727204216.17415: running the handler 41445 1727204216.17417: _low_level_execute_command(): starting 41445 1727204216.17420: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204216.18873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41445 1727204216.18900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204216.18924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204216.18999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204216.20698: stdout chunk (state=3): >>>/root <<< 41445 1727204216.20895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204216.20898: stdout chunk (state=3): >>><<< 41445 1727204216.20903: stderr chunk (state=3): >>><<< 41445 1727204216.20907: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204216.20912: _low_level_execute_command(): starting 41445 1727204216.20945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064 `" && echo ansible-tmp-1727204216.2087748-43578-180233549161064="` echo /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064 `" ) && sleep 0' 41445 1727204216.21788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204216.21882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204216.21886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204216.21889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204216.22179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204216.22190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204216.22193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204216.22196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204216.22215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204216.24112: stdout chunk (state=3): >>>ansible-tmp-1727204216.2087748-43578-180233549161064=/root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064 <<< 41445 1727204216.24382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204216.24386: stdout chunk (state=3): >>><<< 41445 1727204216.24388: stderr chunk (state=3): >>><<< 41445 1727204216.24392: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204216.2087748-43578-180233549161064=/root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204216.24395: variable 'ansible_module_compression' from source: unknown 41445 1727204216.24397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41445 1727204216.24407: variable 'ansible_facts' from source: unknown 41445 1727204216.24515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py 41445 1727204216.24739: Sending initial data 41445 1727204216.24742: Sent initial data (162 bytes) 41445 1727204216.25416: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204216.25492: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204216.25535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204216.25550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204216.25567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204216.25666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204216.27388: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204216.27393: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204216.27423: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpnr51lzns /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py <<< 41445 1727204216.27429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py" <<< 41445 1727204216.27496: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpnr51lzns" to remote "/root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py" <<< 41445 1727204216.28300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204216.28372: stderr chunk (state=3): >>><<< 41445 1727204216.28377: stdout chunk (state=3): >>><<< 41445 1727204216.28398: done transferring module to remote 41445 1727204216.28581: _low_level_execute_command(): starting 41445 1727204216.28584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/ /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py && sleep 0' 41445 1727204216.29079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204216.29089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204216.29196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204216.29225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204216.29283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204216.31328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204216.31338: stdout chunk (state=3): >>><<< 41445 1727204216.31340: stderr chunk (state=3): >>><<< 41445 1727204216.31362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204216.31366: _low_level_execute_command(): starting 41445 1727204216.31368: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/AnsiballZ_service_facts.py && sleep 0' 41445 1727204216.31963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204216.31970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204216.31989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204216.32094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204216.32120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204216.32400: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204217.80952: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 41445 1727204217.81050: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41445 1727204217.82471: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204217.82501: stderr chunk (state=3): >>><<< 41445 1727204217.82504: stdout chunk (state=3): >>><<< 41445 1727204217.82534: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": 
{"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204217.82978: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204217.82987: _low_level_execute_command(): starting 41445 1727204217.82992: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204216.2087748-43578-180233549161064/ > /dev/null 2>&1 && sleep 0' 41445 1727204217.83448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204217.83452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204217.83455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204217.83457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204217.83501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204217.83516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204217.83558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204217.85294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204217.85317: stderr chunk (state=3): >>><<< 41445 1727204217.85320: stdout chunk (state=3): >>><<< 41445 1727204217.85333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204217.85339: handler run 
complete 41445 1727204217.85467: variable 'ansible_facts' from source: unknown 41445 1727204217.85565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204217.85848: variable 'ansible_facts' from source: unknown 41445 1727204217.85931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204217.86047: attempt loop complete, returning result 41445 1727204217.86051: _execute() done 41445 1727204217.86054: dumping result to json 41445 1727204217.86092: done dumping result, returning 41445 1727204217.86101: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-bf02-eee4-0000000007d4] 41445 1727204217.86106: sending task result for task 028d2410-947f-bf02-eee4-0000000007d4 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204217.86737: no more pending results, returning what we have 41445 1727204217.86740: results queue empty 41445 1727204217.86741: checking for any_errors_fatal 41445 1727204217.86746: done checking for any_errors_fatal 41445 1727204217.86747: checking for max_fail_percentage 41445 1727204217.86748: done checking for max_fail_percentage 41445 1727204217.86749: checking to see if all hosts have failed and the running result is not ok 41445 1727204217.86750: done checking to see if all hosts have failed 41445 1727204217.86751: getting the remaining hosts for this loop 41445 1727204217.86752: done getting the remaining hosts for this loop 41445 1727204217.86755: getting the next task for host managed-node3 41445 1727204217.86760: done getting next task for host managed-node3 41445 1727204217.86763: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41445 1727204217.86764: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204217.86772: done sending task result for task 028d2410-947f-bf02-eee4-0000000007d4 41445 1727204217.86777: WORKER PROCESS EXITING 41445 1727204217.86783: getting variables 41445 1727204217.86784: in VariableManager get_vars() 41445 1727204217.86815: Calling all_inventory to load vars for managed-node3 41445 1727204217.86817: Calling groups_inventory to load vars for managed-node3 41445 1727204217.86818: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204217.86825: Calling all_plugins_play to load vars for managed-node3 41445 1727204217.86827: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204217.86829: Calling groups_plugins_play to load vars for managed-node3 41445 1727204217.87832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204217.89514: done with get_vars() 41445 1727204217.89550: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:57 -0400 (0:00:01.775) 0:00:36.684 ***** 41445 1727204217.89655: entering _queue_task() for managed-node3/package_facts 41445 1727204217.90073: worker is 1 (out of 1 available) 41445 1727204217.90090: exiting _queue_task() for managed-node3/package_facts 41445 
1727204217.90104: done queuing things up, now waiting for results queue to drain 41445 1727204217.90105: waiting for pending results... 41445 1727204217.90414: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 41445 1727204217.90522: in run() - task 028d2410-947f-bf02-eee4-0000000007d5 41445 1727204217.90538: variable 'ansible_search_path' from source: unknown 41445 1727204217.90542: variable 'ansible_search_path' from source: unknown 41445 1727204217.90578: calling self._execute() 41445 1727204217.90684: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204217.90688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204217.90707: variable 'omit' from source: magic vars 41445 1727204217.91155: variable 'ansible_distribution_major_version' from source: facts 41445 1727204217.91159: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204217.91162: variable 'omit' from source: magic vars 41445 1727204217.91166: variable 'omit' from source: magic vars 41445 1727204217.91204: variable 'omit' from source: magic vars 41445 1727204217.91261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204217.91290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204217.91309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204217.91330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204217.91342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204217.91378: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204217.91381: 
variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204217.91383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204217.91584: Set connection var ansible_shell_executable to /bin/sh 41445 1727204217.91587: Set connection var ansible_shell_type to sh 41445 1727204217.91590: Set connection var ansible_pipelining to False 41445 1727204217.91592: Set connection var ansible_timeout to 10 41445 1727204217.91594: Set connection var ansible_connection to ssh 41445 1727204217.91597: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204217.91599: variable 'ansible_shell_executable' from source: unknown 41445 1727204217.91601: variable 'ansible_connection' from source: unknown 41445 1727204217.91604: variable 'ansible_module_compression' from source: unknown 41445 1727204217.91606: variable 'ansible_shell_type' from source: unknown 41445 1727204217.91608: variable 'ansible_shell_executable' from source: unknown 41445 1727204217.91610: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204217.91612: variable 'ansible_pipelining' from source: unknown 41445 1727204217.91614: variable 'ansible_timeout' from source: unknown 41445 1727204217.91616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204217.92062: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204217.92067: variable 'omit' from source: magic vars 41445 1727204217.92069: starting attempt loop 41445 1727204217.92072: running the handler 41445 1727204217.92074: _low_level_execute_command(): starting 41445 1727204217.92078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204217.93389: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204217.93394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204217.93425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204217.93470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204217.93498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204217.95103: stdout chunk (state=3): >>>/root <<< 41445 1727204217.95255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204217.95290: stderr chunk (state=3): >>><<< 41445 1727204217.95294: stdout chunk (state=3): >>><<< 41445 1727204217.95411: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204217.95417: _low_level_execute_command(): starting 41445 1727204217.95420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842 `" && echo ansible-tmp-1727204217.9531791-43675-142970559098842="` echo /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842 `" ) && sleep 0' 41445 1727204217.96503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204217.96507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204217.96510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204217.96513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204217.96515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204217.96517: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204217.96527: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204217.96530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204217.96532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204217.96533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204217.96535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204217.96537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204217.96593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204217.96693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204217.96992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204217.98681: stdout chunk (state=3): >>>ansible-tmp-1727204217.9531791-43675-142970559098842=/root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842 <<< 41445 1727204217.98779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204217.98843: stderr chunk (state=3): >>><<< 41445 1727204217.98846: stdout chunk (state=3): >>><<< 41445 1727204217.98862: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204217.9531791-43675-142970559098842=/root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204217.99081: variable 'ansible_module_compression' from source: unknown 41445 1727204217.99094: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41445 1727204217.99272: variable 'ansible_facts' from source: unknown 41445 1727204217.99689: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py 41445 1727204218.00034: Sending initial data 41445 1727204218.00103: Sent initial data (162 bytes) 41445 1727204218.01401: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204218.01511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204218.01530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204218.01614: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204218.01746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204218.01771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204218.01860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204218.03505: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204218.03647: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 41445 1727204218.03689: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpoezw9s7_ /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py <<< 41445 1727204218.03692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py" <<< 41445 1727204218.03724: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpoezw9s7_" to remote "/root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py" <<< 41445 1727204218.07132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204218.07136: stderr chunk (state=3): >>><<< 41445 1727204218.07138: stdout chunk (state=3): >>><<< 41445 1727204218.07140: done transferring module to remote 41445 1727204218.07142: _low_level_execute_command(): starting 41445 1727204218.07144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/ /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py && sleep 0' 41445 1727204218.08048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204218.08056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204218.08066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204218.08091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204218.08186: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204218.08201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204218.08224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204218.08272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204218.10281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204218.10284: stdout chunk (state=3): >>><<< 41445 1727204218.10287: stderr chunk (state=3): >>><<< 41445 1727204218.10290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204218.10294: _low_level_execute_command(): starting 41445 1727204218.10296: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/AnsiballZ_package_facts.py && sleep 0' 41445 1727204218.11593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204218.11705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204218.11784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204218.11787: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204218.11812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204218.56016: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 41445 1727204218.56039: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 41445 1727204218.56110: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": 
"2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 41445 1727204218.56127: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "<<< 41445 1727204218.56223: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": 
"mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": 
"python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": 
"13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": 
"3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": 
[{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": 
"2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": 
"0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", 
"version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41445 1727204218.57924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204218.57955: stderr chunk (state=3): >>><<< 41445 1727204218.57959: stdout chunk (state=3): >>><<< 41445 1727204218.57997: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204218.59770: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
41445 1727204218.59777: _low_level_execute_command(): starting
41445 1727204218.59780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204217.9531791-43675-142970559098842/ > /dev/null 2>&1 && sleep 0'
41445 1727204218.60220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41445 1727204218.60227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41445 1727204218.60245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41445 1727204218.60248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41445 1727204218.60313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<<
41445 1727204218.60317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41445 1727204218.60322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41445 1727204218.60344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41445 1727204218.62162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41445 1727204218.62168: stderr chunk (state=3): >>><<<
41445 1727204218.62181: stdout chunk (state=3): >>><<<
41445 1727204218.62202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.47.22 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
41445 1727204218.62215: handler run complete
41445 1727204218.62823: variable 'ansible_facts' from source: unknown
41445 1727204218.63266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.67300: variable 'ansible_facts' from source: unknown
41445 1727204218.68258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.68709: attempt loop complete, returning result
41445 1727204218.68720: _execute() done
41445 1727204218.68723: dumping result to json
41445 1727204218.68881: done dumping result, returning
41445 1727204218.68890: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-bf02-eee4-0000000007d5]
41445 1727204218.68895: sending task result for task 028d2410-947f-bf02-eee4-0000000007d5
41445 1727204218.70444: done sending task result for task 028d2410-947f-bf02-eee4-0000000007d5
41445 1727204218.70447: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41445 1727204218.70607: no more pending results, returning what we have
41445 1727204218.70612: results queue empty
41445 1727204218.70613: checking for any_errors_fatal
41445 1727204218.70620: done checking for any_errors_fatal
41445 1727204218.70621: checking for max_fail_percentage
41445 1727204218.70623: done checking for max_fail_percentage
41445 1727204218.70624: checking to see if all hosts have failed and the running result is not ok
41445 1727204218.70624: done checking to see if all hosts have failed
41445 1727204218.70625: getting the remaining hosts for this loop
41445 1727204218.70626: done getting the remaining hosts for this loop
41445 1727204218.70630: getting the next task for host managed-node3
41445 1727204218.70637: done getting next task for host managed-node3
41445 1727204218.70641: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
41445 1727204218.70643: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204218.70652: getting variables
41445 1727204218.70653: in VariableManager get_vars()
41445 1727204218.70680: Calling all_inventory to load vars for managed-node3
41445 1727204218.70682: Calling groups_inventory to load vars for managed-node3
41445 1727204218.70683: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204218.70690: Calling all_plugins_play to load vars for managed-node3
41445 1727204218.70691: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204218.70693: Calling groups_plugins_play to load vars for managed-node3
41445 1727204218.71857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.73445: done with get_vars()
41445 1727204218.73485: done getting variables
41445 1727204218.73552: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.839) 0:00:37.523 *****
41445 1727204218.73592: entering _queue_task() for managed-node3/debug
41445 1727204218.73969: worker is 1 (out of 1 available)
41445 1727204218.74033: exiting _queue_task() for managed-node3/debug
41445 1727204218.74095: done queuing things up, now waiting for results queue to drain
41445 1727204218.74097: waiting for pending results...
41445 1727204218.74305: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider
41445 1727204218.74398: in run() - task 028d2410-947f-bf02-eee4-0000000000d8
41445 1727204218.74423: variable 'ansible_search_path' from source: unknown
41445 1727204218.74432: variable 'ansible_search_path' from source: unknown
41445 1727204218.74447: calling self._execute()
41445 1727204218.74580: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204218.74588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204218.74591: variable 'omit' from source: magic vars
41445 1727204218.74966: variable 'ansible_distribution_major_version' from source: facts
41445 1727204218.74969: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204218.74971: variable 'omit' from source: magic vars
41445 1727204218.74989: variable 'omit' from source: magic vars
41445 1727204218.75073: variable 'network_provider' from source: set_fact
41445 1727204218.75101: variable 'omit' from source: magic vars
41445 1727204218.75154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41445 1727204218.75186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41445 1727204218.75211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41445 1727204218.75242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41445 1727204218.75251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41445 1727204218.75273: variable 'inventory_hostname' from source: host vars for 'managed-node3'
41445 1727204218.75278: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204218.75281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204218.75353: Set connection var ansible_shell_executable to /bin/sh
41445 1727204218.75356: Set connection var ansible_shell_type to sh
41445 1727204218.75361: Set connection var ansible_pipelining to False
41445 1727204218.75368: Set connection var ansible_timeout to 10
41445 1727204218.75370: Set connection var ansible_connection to ssh
41445 1727204218.75378: Set connection var ansible_module_compression to ZIP_DEFLATED
41445 1727204218.75398: variable 'ansible_shell_executable' from source: unknown
41445 1727204218.75402: variable 'ansible_connection' from source: unknown
41445 1727204218.75404: variable 'ansible_module_compression' from source: unknown
41445 1727204218.75406: variable 'ansible_shell_type' from source: unknown
41445 1727204218.75409: variable 'ansible_shell_executable' from source: unknown
41445 1727204218.75411: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204218.75416: variable 'ansible_pipelining' from source: unknown
41445 1727204218.75418: variable 'ansible_timeout' from source: unknown
41445 1727204218.75431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204218.75649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
41445 1727204218.75653: variable 'omit' from source: magic vars
41445 1727204218.75655: starting attempt loop
41445 1727204218.75658: running the handler
41445 1727204218.75673: handler run complete
41445 1727204218.75686: attempt loop complete, returning result
41445 1727204218.75696: _execute() done
41445 1727204218.75699: dumping result to json
41445 1727204218.75701: done dumping result, returning
41445 1727204218.75703: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-bf02-eee4-0000000000d8]
41445 1727204218.75706: sending task result for task 028d2410-947f-bf02-eee4-0000000000d8
41445 1727204218.75816: done sending task result for task 028d2410-947f-bf02-eee4-0000000000d8
41445 1727204218.75829: WORKER PROCESS EXITING
ok: [managed-node3] => {}
MSG:
Using network provider: nm
41445 1727204218.75908: no more pending results, returning what we have
41445 1727204218.75915: results queue empty
41445 1727204218.75916: checking for any_errors_fatal
41445 1727204218.75926: done checking for any_errors_fatal
41445 1727204218.75927: checking for max_fail_percentage
41445 1727204218.75928: done checking for max_fail_percentage
41445 1727204218.75929: checking to see if all hosts have failed and the running result is not ok
41445 1727204218.75930: done checking to see if all hosts have failed
41445 1727204218.75931: getting the remaining hosts for this loop
41445 1727204218.75932: done getting the remaining hosts for this loop
41445 1727204218.75936: getting the next task for host managed-node3
41445 1727204218.75941: done getting next task for host managed-node3
41445 1727204218.75946: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
41445 1727204218.75949: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204218.75960: getting variables
41445 1727204218.75962: in VariableManager get_vars()
41445 1727204218.75999: Calling all_inventory to load vars for managed-node3
41445 1727204218.76002: Calling groups_inventory to load vars for managed-node3
41445 1727204218.76004: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204218.76012: Calling all_plugins_play to load vars for managed-node3
41445 1727204218.76014: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204218.76092: Calling groups_plugins_play to load vars for managed-node3
41445 1727204218.77407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.79050: done with get_vars()
41445 1727204218.79077: done getting variables
41445 1727204218.79136: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.055) 0:00:37.579 *****
41445 1727204218.79166: entering _queue_task() for managed-node3/fail
41445 1727204218.79481: worker is 1 (out of 1 available)
41445 1727204218.79493: exiting _queue_task() for managed-node3/fail
41445 1727204218.79505: done queuing things up, now waiting for results queue to drain
41445 1727204218.79506: waiting for pending results...
41445 1727204218.79896: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
41445 1727204218.79907: in run() - task 028d2410-947f-bf02-eee4-0000000000d9
41445 1727204218.79930: variable 'ansible_search_path' from source: unknown
41445 1727204218.79937: variable 'ansible_search_path' from source: unknown
41445 1727204218.79977: calling self._execute()
41445 1727204218.80087: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204218.80102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204218.80122: variable 'omit' from source: magic vars
41445 1727204218.80533: variable 'ansible_distribution_major_version' from source: facts
41445 1727204218.80551: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204218.80663: variable 'network_state' from source: role '' defaults
41445 1727204218.80759: Evaluated conditional (network_state != {}): False
41445 1727204218.80763: when evaluation is False, skipping this task
41445 1727204218.80765: _execute() done
41445 1727204218.80767: dumping result to json
41445 1727204218.80770: done dumping result, returning
41445 1727204218.80772: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-bf02-eee4-0000000000d9]
41445 1727204218.80776: sending task result for task 028d2410-947f-bf02-eee4-0000000000d9
41445 1727204218.80855: done sending task result for task 028d2410-947f-bf02-eee4-0000000000d9
41445 1727204218.80858: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41445 1727204218.80922: no more pending results, returning what we have
41445 1727204218.80926: results queue empty
41445 1727204218.80928: checking for any_errors_fatal
41445 1727204218.80936: done checking for any_errors_fatal
41445 1727204218.80936: checking for max_fail_percentage
41445 1727204218.80939: done checking for max_fail_percentage
41445 1727204218.80940: checking to see if all hosts have failed and the running result is not ok
41445 1727204218.80940: done checking to see if all hosts have failed
41445 1727204218.80941: getting the remaining hosts for this loop
41445 1727204218.80943: done getting the remaining hosts for this loop
41445 1727204218.80947: getting the next task for host managed-node3
41445 1727204218.80954: done getting next task for host managed-node3
41445 1727204218.80959: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
41445 1727204218.80962: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204218.80980: getting variables
41445 1727204218.80982: in VariableManager get_vars()
41445 1727204218.81028: Calling all_inventory to load vars for managed-node3
41445 1727204218.81031: Calling groups_inventory to load vars for managed-node3
41445 1727204218.81034: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204218.81048: Calling all_plugins_play to load vars for managed-node3
41445 1727204218.81051: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204218.81054: Calling groups_plugins_play to load vars for managed-node3
41445 1727204218.82862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.84503: done with get_vars()
41445 1727204218.84534: done getting variables
41445 1727204218.84595: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.054) 0:00:37.634 *****
41445 1727204218.84629: entering _queue_task() for managed-node3/fail
41445 1727204218.85203: worker is 1 (out of 1 available)
41445 1727204218.85213: exiting _queue_task() for managed-node3/fail
41445 1727204218.85222: done queuing things up, now waiting for results queue to drain
41445 1727204218.85223: waiting for pending results...
41445 1727204218.85352: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
41445 1727204218.85407: in run() - task 028d2410-947f-bf02-eee4-0000000000da
41445 1727204218.85429: variable 'ansible_search_path' from source: unknown
41445 1727204218.85437: variable 'ansible_search_path' from source: unknown
41445 1727204218.85482: calling self._execute()
41445 1727204218.85666: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204218.85670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204218.85673: variable 'omit' from source: magic vars
41445 1727204218.86016: variable 'ansible_distribution_major_version' from source: facts
41445 1727204218.86035: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204218.86151: variable 'network_state' from source: role '' defaults
41445 1727204218.86164: Evaluated conditional (network_state != {}): False
41445 1727204218.86170: when evaluation is False, skipping this task
41445 1727204218.86175: _execute() done
41445 1727204218.86183: dumping result to json
41445 1727204218.86189: done dumping result, returning
41445 1727204218.86197: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-bf02-eee4-0000000000da]
41445 1727204218.86213: sending task result for task 028d2410-947f-bf02-eee4-0000000000da
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41445 1727204218.86364: no more pending results, returning what we have
41445 1727204218.86370: results queue empty
41445 1727204218.86371: checking for any_errors_fatal
41445 1727204218.86384: done checking for any_errors_fatal
41445 1727204218.86385: checking for max_fail_percentage
41445 1727204218.86388: done checking for max_fail_percentage
41445 1727204218.86389: checking to see if all hosts have failed and the running result is not ok
41445 1727204218.86390: done checking to see if all hosts have failed
41445 1727204218.86391: getting the remaining hosts for this loop
41445 1727204218.86392: done getting the remaining hosts for this loop
41445 1727204218.86397: getting the next task for host managed-node3
41445 1727204218.86404: done getting next task for host managed-node3
41445 1727204218.86408: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
41445 1727204218.86413: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204218.86431: getting variables
41445 1727204218.86433: in VariableManager get_vars()
41445 1727204218.86580: Calling all_inventory to load vars for managed-node3
41445 1727204218.86584: Calling groups_inventory to load vars for managed-node3
41445 1727204218.86587: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204218.86601: Calling all_plugins_play to load vars for managed-node3
41445 1727204218.86605: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204218.86607: Calling groups_plugins_play to load vars for managed-node3
41445 1727204218.87288: done sending task result for task 028d2410-947f-bf02-eee4-0000000000da
41445 1727204218.87292: WORKER PROCESS EXITING
41445 1727204218.88202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.89679: done with get_vars()
41445 1727204218.89702: done getting variables
41445 1727204218.89750: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.051) 0:00:37.685 *****
41445 1727204218.89774: entering _queue_task() for managed-node3/fail
41445 1727204218.90032: worker is 1 (out of 1 available)
41445 1727204218.90045: exiting _queue_task() for managed-node3/fail
41445 1727204218.90056: done queuing things up, now waiting for results queue to drain
41445 1727204218.90057: waiting for pending results...
41445 1727204218.90243: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
41445 1727204218.90320: in run() - task 028d2410-947f-bf02-eee4-0000000000db
41445 1727204218.90334: variable 'ansible_search_path' from source: unknown
41445 1727204218.90338: variable 'ansible_search_path' from source: unknown
41445 1727204218.90366: calling self._execute()
41445 1727204218.90449: variable 'ansible_host' from source: host vars for 'managed-node3'
41445 1727204218.90453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3'
41445 1727204218.90462: variable 'omit' from source: magic vars
41445 1727204218.90759: variable 'ansible_distribution_major_version' from source: facts
41445 1727204218.90769: Evaluated conditional (ansible_distribution_major_version != '6'): True
41445 1727204218.90890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41445 1727204218.93069: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41445 1727204218.93127: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41445 1727204218.93154: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41445 1727204218.93179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41445 1727204218.93199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41445 1727204218.93260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204218.93281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204218.93299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204218.93331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204218.93343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204218.93415: variable 'ansible_distribution_major_version' from source: facts
41445 1727204218.93430: Evaluated conditional (ansible_distribution_major_version | int > 9): True
41445 1727204218.93507: variable 'ansible_distribution' from source: facts
41445 1727204218.93513: variable '__network_rh_distros' from source: role '' defaults
41445 1727204218.93519: Evaluated conditional (ansible_distribution in __network_rh_distros): True
41445 1727204218.93681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204218.93699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204218.93717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204218.93742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204218.93754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204218.93790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204218.93806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204218.93823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204218.93848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204218.93862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204218.93893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41445 1727204218.93909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41445 1727204218.93927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204218.93950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41445 1727204218.93960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41445 1727204218.94150: variable 'network_connections' from source: play vars
41445 1727204218.94159: variable 'profile' from source: play vars
41445 1727204218.94215: variable 'profile' from source: play vars
41445 1727204218.94219: variable 'interface' from source: set_fact
41445 1727204218.94258: variable 'interface' from source: set_fact
41445 1727204218.94267: variable 'network_state' from source: role '' defaults
41445 1727204218.94321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41445 1727204218.94430: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41445 1727204218.94457: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41445 1727204218.94490: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41445 1727204218.94517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41445 1727204218.94545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41445 1727204218.94564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41445 1727204218.94582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41445 1727204218.94620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41445 1727204218.94639: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
41445 1727204218.94643: when evaluation is False, skipping this task
41445 1727204218.94645: _execute() done
41445 1727204218.94648: dumping result to json
41445 1727204218.94650: done dumping result, returning
41445 1727204218.94656: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-bf02-eee4-0000000000db]
41445 1727204218.94661: sending task result for task 028d2410-947f-bf02-eee4-0000000000db
41445 1727204218.94772: done sending task result for task 028d2410-947f-bf02-eee4-0000000000db
41445 1727204218.94775: WORKER PROCESS EXITING
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
41445 1727204218.94862: no more pending results, returning what we have
41445 1727204218.94865: results queue empty
41445 1727204218.94866: checking for any_errors_fatal
41445 1727204218.94871: done checking for any_errors_fatal
41445 1727204218.94872: checking for max_fail_percentage
41445 1727204218.94873: done checking for max_fail_percentage
41445 1727204218.94874: checking to see if all hosts have failed and the running result is not ok
41445 1727204218.95048: done checking to see if all hosts have failed
41445 1727204218.95050: getting the remaining hosts for this loop
41445 1727204218.95051: done getting the remaining hosts for this loop
41445 1727204218.95055: getting the next task for host managed-node3
41445 1727204218.95061: done getting next task for host managed-node3
41445 1727204218.95065: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
41445 1727204218.95067: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204218.95081: getting variables
41445 1727204218.95083: in VariableManager get_vars()
41445 1727204218.95122: Calling all_inventory to load vars for managed-node3
41445 1727204218.95125: Calling groups_inventory to load vars for managed-node3
41445 1727204218.95127: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204218.95137: Calling all_plugins_play to load vars for managed-node3
41445 1727204218.95140: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204218.95142: Calling groups_plugins_play to load vars for managed-node3
41445 1727204218.96522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204218.97431: done with get_vars()
41445 1727204218.97448: done getting variables
41445 1727204218.97495: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.077) 0:00:37.762 *****
41445 1727204218.97519: entering _queue_task() for managed-node3/dnf
41445 1727204218.97769: worker is 1 (out of 1 available)
41445 1727204218.97785: exiting _queue_task() for managed-node3/dnf
41445 1727204218.97797: done queuing things up, now waiting for results queue to drain
41445 1727204218.97798: waiting for pending results...
41445 1727204218.98003: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41445 1727204218.98097: in run() - task 028d2410-947f-bf02-eee4-0000000000dc 41445 1727204218.98109: variable 'ansible_search_path' from source: unknown 41445 1727204218.98112: variable 'ansible_search_path' from source: unknown 41445 1727204218.98177: calling self._execute() 41445 1727204218.98321: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204218.98325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204218.98328: variable 'omit' from source: magic vars 41445 1727204218.98846: variable 'ansible_distribution_major_version' from source: facts 41445 1727204218.98850: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204218.98974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204219.01729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204219.01808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204219.01858: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204219.01900: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204219.01937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204219.02025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.02062: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.02083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.02119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.02134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.02232: variable 'ansible_distribution' from source: facts 41445 1727204219.02235: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.02248: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41445 1727204219.02338: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204219.02423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.02447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.02461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.02488: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.02501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.02531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.02553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.02624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.02628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.02651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.02884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.02887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 
1727204219.02895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.02922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.02925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.03277: variable 'network_connections' from source: play vars 41445 1727204219.03284: variable 'profile' from source: play vars 41445 1727204219.03287: variable 'profile' from source: play vars 41445 1727204219.03289: variable 'interface' from source: set_fact 41445 1727204219.03291: variable 'interface' from source: set_fact 41445 1727204219.03429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204219.03547: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204219.03569: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204219.03601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204219.03680: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204219.03683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204219.03731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204219.03798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.03808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204219.03859: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204219.04180: variable 'network_connections' from source: play vars 41445 1727204219.04183: variable 'profile' from source: play vars 41445 1727204219.04189: variable 'profile' from source: play vars 41445 1727204219.04205: variable 'interface' from source: set_fact 41445 1727204219.04290: variable 'interface' from source: set_fact 41445 1727204219.04408: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204219.04424: when evaluation is False, skipping this task 41445 1727204219.04581: _execute() done 41445 1727204219.04584: dumping result to json 41445 1727204219.04586: done dumping result, returning 41445 1727204219.04588: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000dc] 41445 1727204219.04590: sending task result for task 028d2410-947f-bf02-eee4-0000000000dc 41445 1727204219.04660: done sending task result for task 028d2410-947f-bf02-eee4-0000000000dc 41445 1727204219.04663: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 41445 1727204219.04719: no more pending results, returning what we have 41445 1727204219.04722: results queue empty 41445 1727204219.04723: checking for any_errors_fatal 41445 1727204219.04730: done checking for any_errors_fatal 41445 1727204219.04730: checking for max_fail_percentage 41445 1727204219.04732: done checking for max_fail_percentage 41445 1727204219.04733: checking to see if all hosts have failed and the running result is not ok 41445 1727204219.04734: done checking to see if all hosts have failed 41445 1727204219.04735: getting the remaining hosts for this loop 41445 1727204219.04736: done getting the remaining hosts for this loop 41445 1727204219.04740: getting the next task for host managed-node3 41445 1727204219.04747: done getting next task for host managed-node3 41445 1727204219.04751: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41445 1727204219.04753: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204219.04767: getting variables 41445 1727204219.04769: in VariableManager get_vars() 41445 1727204219.05010: Calling all_inventory to load vars for managed-node3 41445 1727204219.05013: Calling groups_inventory to load vars for managed-node3 41445 1727204219.05015: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204219.05025: Calling all_plugins_play to load vars for managed-node3 41445 1727204219.05028: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204219.05035: Calling groups_plugins_play to load vars for managed-node3 41445 1727204219.06572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204219.08151: done with get_vars() 41445 1727204219.08169: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41445 1727204219.08228: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.107) 0:00:37.870 ***** 41445 1727204219.08250: entering _queue_task() for managed-node3/yum 41445 1727204219.08505: worker is 1 (out of 1 available) 41445 1727204219.08520: exiting _queue_task() for managed-node3/yum 41445 1727204219.08531: done queuing things up, now waiting for results queue to drain 41445 1727204219.08533: waiting for pending results... 
41445 1727204219.08714: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41445 1727204219.08788: in run() - task 028d2410-947f-bf02-eee4-0000000000dd 41445 1727204219.08800: variable 'ansible_search_path' from source: unknown 41445 1727204219.08804: variable 'ansible_search_path' from source: unknown 41445 1727204219.08833: calling self._execute() 41445 1727204219.08987: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204219.08991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204219.08996: variable 'omit' from source: magic vars 41445 1727204219.09206: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.09219: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204219.09344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204219.11282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204219.11290: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204219.11331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204219.11369: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204219.11396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204219.11487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.11523: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.11552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.11598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.11619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.11711: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.11731: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41445 1727204219.11737: when evaluation is False, skipping this task 41445 1727204219.11743: _execute() done 41445 1727204219.11748: dumping result to json 41445 1727204219.11753: done dumping result, returning 41445 1727204219.11762: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000dd] 41445 1727204219.11771: sending task result for task 028d2410-947f-bf02-eee4-0000000000dd skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41445 1727204219.11927: no more pending results, returning what we have 41445 1727204219.11931: results queue empty 41445 1727204219.11932: checking for any_errors_fatal 41445 1727204219.11937: done 
checking for any_errors_fatal 41445 1727204219.11938: checking for max_fail_percentage 41445 1727204219.11940: done checking for max_fail_percentage 41445 1727204219.11941: checking to see if all hosts have failed and the running result is not ok 41445 1727204219.11942: done checking to see if all hosts have failed 41445 1727204219.11942: getting the remaining hosts for this loop 41445 1727204219.11943: done getting the remaining hosts for this loop 41445 1727204219.11947: getting the next task for host managed-node3 41445 1727204219.11953: done getting next task for host managed-node3 41445 1727204219.11957: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41445 1727204219.11959: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204219.11971: getting variables 41445 1727204219.11973: in VariableManager get_vars() 41445 1727204219.12013: Calling all_inventory to load vars for managed-node3 41445 1727204219.12017: Calling groups_inventory to load vars for managed-node3 41445 1727204219.12019: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204219.12029: Calling all_plugins_play to load vars for managed-node3 41445 1727204219.12032: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204219.12035: Calling groups_plugins_play to load vars for managed-node3 41445 1727204219.12789: done sending task result for task 028d2410-947f-bf02-eee4-0000000000dd 41445 1727204219.12792: WORKER PROCESS EXITING 41445 1727204219.13469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204219.14986: done with get_vars() 41445 1727204219.15013: done getting variables 41445 1727204219.15060: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.068) 0:00:37.938 ***** 41445 1727204219.15085: entering _queue_task() for managed-node3/fail 41445 1727204219.15327: worker is 1 (out of 1 available) 41445 1727204219.15341: exiting _queue_task() for managed-node3/fail 41445 1727204219.15353: done queuing things up, now waiting for results queue to drain 41445 1727204219.15354: waiting for pending results... 
41445 1727204219.15544: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41445 1727204219.15619: in run() - task 028d2410-947f-bf02-eee4-0000000000de 41445 1727204219.15631: variable 'ansible_search_path' from source: unknown 41445 1727204219.15635: variable 'ansible_search_path' from source: unknown 41445 1727204219.15662: calling self._execute() 41445 1727204219.15748: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204219.15752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204219.15762: variable 'omit' from source: magic vars 41445 1727204219.16043: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.16052: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204219.16140: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204219.16273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204219.17982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204219.18020: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204219.18059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204219.18100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204219.18132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204219.18216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204219.18282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.18286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.18330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.18350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.18582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.18585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.18593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.18596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.18598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.18638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.18659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.18688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.18730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.18744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.18993: variable 'network_connections' from source: play vars 41445 1727204219.19005: variable 'profile' from source: play vars 41445 1727204219.19091: variable 'profile' from source: play vars 41445 1727204219.19094: variable 'interface' from source: set_fact 41445 1727204219.19170: variable 'interface' from source: set_fact 41445 1727204219.19260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204219.19698: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204219.19731: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204219.19757: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204219.19785: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204219.19815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204219.19832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204219.19849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.19867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204219.19910: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204219.20067: variable 'network_connections' from source: play vars 41445 1727204219.20070: variable 'profile' from source: play vars 41445 1727204219.20120: variable 'profile' from source: play vars 41445 1727204219.20123: variable 'interface' from source: set_fact 41445 1727204219.20165: variable 'interface' from source: set_fact 41445 1727204219.20184: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204219.20188: when evaluation is False, skipping this task 41445 1727204219.20190: _execute() done 41445 1727204219.20193: dumping result to json 41445 1727204219.20195: done dumping result, returning 41445 1727204219.20202: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000de] 41445 1727204219.20214: sending task result for task 028d2410-947f-bf02-eee4-0000000000de 41445 1727204219.20296: done sending task result for task 028d2410-947f-bf02-eee4-0000000000de 41445 1727204219.20299: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41445 1727204219.20373: no more pending results, returning what we have 41445 1727204219.20379: results queue empty 41445 1727204219.20380: checking for any_errors_fatal 41445 1727204219.20385: done checking for any_errors_fatal 41445 1727204219.20386: checking for max_fail_percentage 41445 1727204219.20387: done checking for max_fail_percentage 41445 1727204219.20388: checking to see if all hosts have failed and the running result is not ok 41445 1727204219.20389: done checking to see if all hosts have failed 41445 1727204219.20389: getting the remaining hosts for this loop 41445 1727204219.20391: done getting the remaining hosts for this loop 41445 1727204219.20394: getting the next task for host managed-node3 41445 1727204219.20400: done getting next task for host managed-node3 41445 1727204219.20404: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41445 1727204219.20406: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204219.20419: getting variables 41445 1727204219.20421: in VariableManager get_vars() 41445 1727204219.20459: Calling all_inventory to load vars for managed-node3 41445 1727204219.20462: Calling groups_inventory to load vars for managed-node3 41445 1727204219.20464: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204219.20473: Calling all_plugins_play to load vars for managed-node3 41445 1727204219.20478: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204219.20480: Calling groups_plugins_play to load vars for managed-node3 41445 1727204219.21499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204219.24020: done with get_vars() 41445 1727204219.24055: done getting variables 41445 1727204219.24122: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.090) 0:00:38.029 ***** 41445 1727204219.24158: entering _queue_task() for managed-node3/package 41445 1727204219.24917: worker is 1 (out of 1 available) 41445 1727204219.24930: exiting _queue_task() for managed-node3/package 41445 1727204219.24942: done queuing things up, now waiting for results queue to drain 41445 1727204219.24943: waiting for pending results... 
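The skip just above illustrates the role's pattern of gating tasks on boolean role defaults. The trace records only the task name and the failed conditional, so the following is a hedged reconstruction — the module and prompt text are illustrative assumptions, not taken from the trace:

```yaml
# Sketch only: the trace shows the task name and the conditional
# (__network_wireless_connections_defined or __network_team_connections_defined),
# but not the task body. Module choice and prompt are assumed.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.pause:
    prompt: "Wireless or team interfaces require a NetworkManager restart. Continue?"
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both defined-connection flags evaluated to False here, Ansible reports `skip_reason: "Conditional result was False"` without ever executing the module.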
41445 1727204219.25634: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 41445 1727204219.25818: in run() - task 028d2410-947f-bf02-eee4-0000000000df 41445 1727204219.25983: variable 'ansible_search_path' from source: unknown 41445 1727204219.25988: variable 'ansible_search_path' from source: unknown 41445 1727204219.26014: calling self._execute() 41445 1727204219.26341: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204219.26344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204219.26347: variable 'omit' from source: magic vars 41445 1727204219.27298: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.27318: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204219.27543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204219.27845: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204219.27906: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204219.27949: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204219.27997: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204219.28122: variable 'network_packages' from source: role '' defaults 41445 1727204219.28243: variable '__network_provider_setup' from source: role '' defaults 41445 1727204219.28261: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204219.28344: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204219.28362: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204219.28484: variable 
'__network_packages_default_nm' from source: role '' defaults 41445 1727204219.28701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204219.33816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204219.34046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204219.34095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204219.34236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204219.34240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204219.34407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.34582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.34586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.34701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.34728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 
1727204219.34882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.34930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.34990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.35081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.35117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.35887: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204219.35890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.36082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.36127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.36255: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.36274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.36491: variable 'ansible_python' from source: facts 41445 1727204219.36524: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204219.36736: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204219.36902: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204219.37287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.37326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.37443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.37584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.37783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.37956: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.37968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.38087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.38140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.38328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.38789: variable 'network_connections' from source: play vars 41445 1727204219.38803: variable 'profile' from source: play vars 41445 1727204219.39304: variable 'profile' from source: play vars 41445 1727204219.39308: variable 'interface' from source: set_fact 41445 1727204219.39313: variable 'interface' from source: set_fact 41445 1727204219.39583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204219.39627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204219.39771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.39882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204219.39971: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204219.40621: variable 'network_connections' from source: play vars 41445 1727204219.40632: variable 'profile' from source: play vars 41445 1727204219.40856: variable 'profile' from source: play vars 41445 1727204219.40869: variable 'interface' from source: set_fact 41445 1727204219.41160: variable 'interface' from source: set_fact 41445 1727204219.41163: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204219.41300: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204219.42053: variable 'network_connections' from source: play vars 41445 1727204219.42064: variable 'profile' from source: play vars 41445 1727204219.42231: variable 'profile' from source: play vars 41445 1727204219.42245: variable 'interface' from source: set_fact 41445 1727204219.42469: variable 'interface' from source: set_fact 41445 1727204219.42530: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204219.42702: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204219.43563: variable 'network_connections' from source: play vars 41445 1727204219.43566: variable 'profile' from source: play vars 41445 1727204219.43646: variable 'profile' from source: play vars 41445 1727204219.43695: variable 'interface' from source: set_fact 41445 1727204219.43997: variable 'interface' from source: set_fact 41445 1727204219.44106: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204219.44190: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204219.44194: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204219.44409: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204219.45050: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204219.46234: variable 'network_connections' from source: play vars 41445 1727204219.46416: variable 'profile' from source: play vars 41445 1727204219.46419: variable 'profile' from source: play vars 41445 1727204219.46421: variable 'interface' from source: set_fact 41445 1727204219.46556: variable 'interface' from source: set_fact 41445 1727204219.46570: variable 'ansible_distribution' from source: facts 41445 1727204219.46639: variable '__network_rh_distros' from source: role '' defaults 41445 1727204219.46649: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.46668: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204219.47181: variable 'ansible_distribution' from source: facts 41445 1727204219.47184: variable '__network_rh_distros' from source: role '' defaults 41445 1727204219.47186: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.47189: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204219.47380: variable 'ansible_distribution' from source: facts 41445 1727204219.47487: variable '__network_rh_distros' from source: role '' defaults 41445 1727204219.47509: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.47550: variable 'network_provider' from source: set_fact 41445 1727204219.47633: variable 'ansible_facts' from source: unknown 41445 1727204219.49266: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41445 
1727204219.49319: when evaluation is False, skipping this task 41445 1727204219.49328: _execute() done 41445 1727204219.49336: dumping result to json 41445 1727204219.49344: done dumping result, returning 41445 1727204219.49464: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-bf02-eee4-0000000000df] 41445 1727204219.49468: sending task result for task 028d2410-947f-bf02-eee4-0000000000df 41445 1727204219.49720: done sending task result for task 028d2410-947f-bf02-eee4-0000000000df 41445 1727204219.49723: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41445 1727204219.49788: no more pending results, returning what we have 41445 1727204219.49792: results queue empty 41445 1727204219.49793: checking for any_errors_fatal 41445 1727204219.49800: done checking for any_errors_fatal 41445 1727204219.49801: checking for max_fail_percentage 41445 1727204219.49802: done checking for max_fail_percentage 41445 1727204219.49803: checking to see if all hosts have failed and the running result is not ok 41445 1727204219.49804: done checking to see if all hosts have failed 41445 1727204219.49805: getting the remaining hosts for this loop 41445 1727204219.49806: done getting the remaining hosts for this loop 41445 1727204219.49812: getting the next task for host managed-node3 41445 1727204219.49819: done getting next task for host managed-node3 41445 1727204219.49824: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204219.49826: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41445 1727204219.49842: getting variables 41445 1727204219.49844: in VariableManager get_vars() 41445 1727204219.49887: Calling all_inventory to load vars for managed-node3 41445 1727204219.49890: Calling groups_inventory to load vars for managed-node3 41445 1727204219.49892: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204219.49912: Calling all_plugins_play to load vars for managed-node3 41445 1727204219.49915: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204219.49919: Calling groups_plugins_play to load vars for managed-node3 41445 1727204219.53484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204219.57747: done with get_vars() 41445 1727204219.57786: done getting variables 41445 1727204219.57857: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.337) 0:00:38.366 ***** 41445 1727204219.57892: entering _queue_task() for managed-node3/package 41445 1727204219.58294: worker is 1 (out of 1 available) 41445 1727204219.58309: exiting _queue_task() for managed-node3/package 41445 1727204219.58323: done queuing things up, now waiting for results queue to drain 41445 1727204219.58325: waiting for pending results... 
41445 1727204219.58712: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41445 1727204219.58719: in run() - task 028d2410-947f-bf02-eee4-0000000000e0 41445 1727204219.58722: variable 'ansible_search_path' from source: unknown 41445 1727204219.58726: variable 'ansible_search_path' from source: unknown 41445 1727204219.58730: calling self._execute() 41445 1727204219.58809: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204219.58816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204219.58824: variable 'omit' from source: magic vars 41445 1727204219.59195: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.59205: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204219.59350: variable 'network_state' from source: role '' defaults 41445 1727204219.59353: Evaluated conditional (network_state != {}): False 41445 1727204219.59356: when evaluation is False, skipping this task 41445 1727204219.59358: _execute() done 41445 1727204219.59360: dumping result to json 41445 1727204219.59362: done dumping result, returning 41445 1727204219.59365: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-bf02-eee4-0000000000e0] 41445 1727204219.59368: sending task result for task 028d2410-947f-bf02-eee4-0000000000e0 41445 1727204219.59521: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e0 41445 1727204219.59524: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204219.59601: no more pending results, returning what we have 41445 1727204219.59604: results queue empty 41445 1727204219.59605: checking 
for any_errors_fatal 41445 1727204219.59613: done checking for any_errors_fatal 41445 1727204219.59613: checking for max_fail_percentage 41445 1727204219.59615: done checking for max_fail_percentage 41445 1727204219.59616: checking to see if all hosts have failed and the running result is not ok 41445 1727204219.59617: done checking to see if all hosts have failed 41445 1727204219.59617: getting the remaining hosts for this loop 41445 1727204219.59618: done getting the remaining hosts for this loop 41445 1727204219.59621: getting the next task for host managed-node3 41445 1727204219.59626: done getting next task for host managed-node3 41445 1727204219.59629: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204219.59631: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204219.59644: getting variables 41445 1727204219.59645: in VariableManager get_vars() 41445 1727204219.59680: Calling all_inventory to load vars for managed-node3 41445 1727204219.59683: Calling groups_inventory to load vars for managed-node3 41445 1727204219.59685: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204219.59693: Calling all_plugins_play to load vars for managed-node3 41445 1727204219.59695: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204219.59698: Calling groups_plugins_play to load vars for managed-node3 41445 1727204219.77691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204219.79934: done with get_vars() 41445 1727204219.79967: done getting variables 41445 1727204219.80122: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.222) 0:00:38.589 ***** 41445 1727204219.80155: entering _queue_task() for managed-node3/package 41445 1727204219.80572: worker is 1 (out of 1 available) 41445 1727204219.80587: exiting _queue_task() for managed-node3/package 41445 1727204219.80599: done queuing things up, now waiting for results queue to drain 41445 1727204219.80600: waiting for pending results... 
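The nmstate-related task (`tasks/main.yml:85`) was skipped because `network_state` is still its role default of `{}` — this play drives configuration through `network_connections` instead. A sketch under that assumption; the package list is inferred from the task name, not from the trace:

```yaml
# Sketch: package names inferred from the task name; only the
# "network_state != {}" conditional is confirmed by the trace.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```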
41445 1727204219.81019: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41445 1727204219.81161: in run() - task 028d2410-947f-bf02-eee4-0000000000e1 41445 1727204219.81165: variable 'ansible_search_path' from source: unknown 41445 1727204219.81169: variable 'ansible_search_path' from source: unknown 41445 1727204219.81221: calling self._execute() 41445 1727204219.81328: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204219.81436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204219.81440: variable 'omit' from source: magic vars 41445 1727204219.82164: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.82331: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204219.82566: variable 'network_state' from source: role '' defaults 41445 1727204219.82584: Evaluated conditional (network_state != {}): False 41445 1727204219.82594: when evaluation is False, skipping this task 41445 1727204219.82665: _execute() done 41445 1727204219.82690: dumping result to json 41445 1727204219.82741: done dumping result, returning 41445 1727204219.82796: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-bf02-eee4-0000000000e1] 41445 1727204219.82867: sending task result for task 028d2410-947f-bf02-eee4-0000000000e1 41445 1727204219.82965: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e1 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204219.83034: no more pending results, returning what we have 41445 1727204219.83038: results queue empty 41445 1727204219.83040: checking for any_errors_fatal 41445 1727204219.83050: done checking for 
any_errors_fatal 41445 1727204219.83051: checking for max_fail_percentage 41445 1727204219.83052: done checking for max_fail_percentage 41445 1727204219.83053: checking to see if all hosts have failed and the running result is not ok 41445 1727204219.83054: done checking to see if all hosts have failed 41445 1727204219.83055: getting the remaining hosts for this loop 41445 1727204219.83056: done getting the remaining hosts for this loop 41445 1727204219.83060: getting the next task for host managed-node3 41445 1727204219.83068: done getting next task for host managed-node3 41445 1727204219.83076: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204219.83080: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204219.83098: getting variables 41445 1727204219.83100: in VariableManager get_vars() 41445 1727204219.83144: Calling all_inventory to load vars for managed-node3 41445 1727204219.83147: Calling groups_inventory to load vars for managed-node3 41445 1727204219.83152: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204219.83166: Calling all_plugins_play to load vars for managed-node3 41445 1727204219.83171: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204219.83542: Calling groups_plugins_play to load vars for managed-node3 41445 1727204219.84137: WORKER PROCESS EXITING 41445 1727204219.85433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204219.87604: done with get_vars() 41445 1727204219.87630: done getting variables 41445 1727204219.87705: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.075) 0:00:38.665 ***** 41445 1727204219.87737: entering _queue_task() for managed-node3/service 41445 1727204219.88222: worker is 1 (out of 1 available) 41445 1727204219.88236: exiting _queue_task() for managed-node3/service 41445 1727204219.88250: done queuing things up, now waiting for results queue to drain 41445 1727204219.88251: waiting for pending results... 
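For the restart task queued above (`tasks/main.yml:109`) the trace confirms the `service` action plugin and the same wireless/team conditional as the earlier consent task. A hedged sketch of what such a task plausibly looks like — the service name and state are assumptions consistent with the task name:

```yaml
# Sketch: the trace confirms the service action plugin and the conditional;
# the service name/state are assumed from the task name.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Pairing the earlier consent prompt with this restart behind one shared conditional means a disruptive restart can only happen when wireless or team profiles are actually defined.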
41445 1727204219.88731: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41445 1727204219.89184: in run() - task 028d2410-947f-bf02-eee4-0000000000e2 41445 1727204219.89365: variable 'ansible_search_path' from source: unknown 41445 1727204219.89369: variable 'ansible_search_path' from source: unknown 41445 1727204219.89371: calling self._execute() 41445 1727204219.89555: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204219.89591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204219.89594: variable 'omit' from source: magic vars 41445 1727204219.90335: variable 'ansible_distribution_major_version' from source: facts 41445 1727204219.90357: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204219.90514: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204219.91009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204219.95519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204219.95722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204219.96001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204219.96004: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204219.96006: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204219.96009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41445 1727204219.96012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.96049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.96125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.96147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.96198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.96280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.96311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.96389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.96407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.96600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204219.96628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204219.96664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.96735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204219.96757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204219.97003: variable 'network_connections' from source: play vars 41445 1727204219.97081: variable 'profile' from source: play vars 41445 1727204219.97138: variable 'profile' from source: play vars 41445 1727204219.97378: variable 'interface' from source: set_fact 41445 1727204219.97384: variable 'interface' from source: set_fact 41445 1727204219.97387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204219.97508: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204219.97550: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204219.97596: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204219.97634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204219.97682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204219.97711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204219.97773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204219.97808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204219.97870: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204219.98121: variable 'network_connections' from source: play vars 41445 1727204219.98132: variable 'profile' from source: play vars 41445 1727204219.98202: variable 'profile' from source: play vars 41445 1727204219.98261: variable 'interface' from source: set_fact 41445 1727204219.98358: variable 'interface' from source: set_fact 41445 1727204219.98472: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41445 1727204219.98477: when evaluation is False, skipping this task 41445 1727204219.98480: _execute() done 41445 1727204219.98482: dumping result to json 41445 1727204219.98489: done dumping result, returning 41445 1727204219.98491: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [028d2410-947f-bf02-eee4-0000000000e2]
41445 1727204219.98505: sending task result for task 028d2410-947f-bf02-eee4-0000000000e2
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
41445 1727204219.98862: no more pending results, returning what we have
41445 1727204219.98865: results queue empty
41445 1727204219.98866: checking for any_errors_fatal
41445 1727204219.98874: done checking for any_errors_fatal
41445 1727204219.98877: checking for max_fail_percentage
41445 1727204219.98879: done checking for max_fail_percentage
41445 1727204219.98880: checking to see if all hosts have failed and the running result is not ok
41445 1727204219.98881: done checking to see if all hosts have failed
41445 1727204219.98881: getting the remaining hosts for this loop
41445 1727204219.98883: done getting the remaining hosts for this loop
41445 1727204219.98887: getting the next task for host managed-node3
41445 1727204219.98894: done getting next task for host managed-node3
41445 1727204219.98897: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
41445 1727204219.98900: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41445 1727204219.98913: getting variables
41445 1727204219.98915: in VariableManager get_vars()
41445 1727204219.99108: Calling all_inventory to load vars for managed-node3
41445 1727204219.99112: Calling groups_inventory to load vars for managed-node3
41445 1727204219.99114: Calling all_plugins_inventory to load vars for managed-node3
41445 1727204219.99190: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e2
41445 1727204219.99193: WORKER PROCESS EXITING
41445 1727204219.99202: Calling all_plugins_play to load vars for managed-node3
41445 1727204219.99205: Calling groups_plugins_inventory to load vars for managed-node3
41445 1727204219.99208: Calling groups_plugins_play to load vars for managed-node3
41445 1727204220.02104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41445 1727204220.04950: done with get_vars()
41445 1727204220.04991: done getting variables
41445 1727204220.05055: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024  14:57:00 -0400 (0:00:00.173) 0:00:38.838 *****

41445 1727204220.05096: entering _queue_task() for managed-node3/service
41445 1727204220.05485: worker is 1 (out of 1 available)
41445 1727204220.05500: exiting _queue_task() for managed-node3/service
41445 1727204220.05514: done queuing things up, now waiting for results queue to drain
41445 1727204220.05515: waiting for pending results...
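The skip reported above follows Ansible's `when:` semantics: every conditional on the task must evaluate True, and the first False expression is echoed back as `false_condition` in the result. The following is a simplified sketch of that behavior for illustration only, not Ansible's actual TaskExecutor code; the function name and input format are invented for the example.

```python
# Hypothetical sketch (not Ansible's real implementation) of how a failed
# `when:` conditional becomes the "skipping" result shown in the log above.
def run_task(conditionals):
    """conditionals: list of (expression_text, evaluated_boolean) pairs."""
    for expression, value in conditionals:
        if not value:
            # First False conditional short-circuits the task into a skip,
            # reporting the failing expression back as false_condition.
            return {
                "changed": False,
                "false_condition": expression,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}

# Mirrors the trace: the distro check evaluated True, the
# wireless/team check evaluated False, so the task is skipped.
result = run_task([
    ("ansible_distribution_major_version != '6'", True),
    ("__network_wireless_connections_defined or "
     "__network_team_connections_defined", False),
])
```

Note that the result dictionary matches the `skipping: [managed-node3] => {...}` payload in the trace, which is how the callback plugin renders a skipped task at this verbosity.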
41445 1727204220.05810: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41445 1727204220.05937: in run() - task 028d2410-947f-bf02-eee4-0000000000e3 41445 1727204220.06051: variable 'ansible_search_path' from source: unknown 41445 1727204220.06054: variable 'ansible_search_path' from source: unknown 41445 1727204220.06083: calling self._execute() 41445 1727204220.06221: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.06225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.06228: variable 'omit' from source: magic vars 41445 1727204220.07015: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.07020: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204220.07126: variable 'network_provider' from source: set_fact 41445 1727204220.07138: variable 'network_state' from source: role '' defaults 41445 1727204220.07197: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41445 1727204220.07425: variable 'omit' from source: magic vars 41445 1727204220.07428: variable 'omit' from source: magic vars 41445 1727204220.07430: variable 'network_service_name' from source: role '' defaults 41445 1727204220.07482: variable 'network_service_name' from source: role '' defaults 41445 1727204220.07622: variable '__network_provider_setup' from source: role '' defaults 41445 1727204220.07642: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204220.07711: variable '__network_service_name_default_nm' from source: role '' defaults 41445 1727204220.07726: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204220.07798: variable '__network_packages_default_nm' from source: role '' defaults 41445 1727204220.08034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 41445 1727204220.11303: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204220.11353: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204220.11470: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204220.11758: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204220.11773: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204220.11793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.11830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.11869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.11920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.11942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.12006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41445 1727204220.12111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.12141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.12354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.12358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.12575: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41445 1727204220.12709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.12745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.12778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.12890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.12973: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.13285: variable 'ansible_python' from source: facts 41445 1727204220.13288: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41445 1727204220.13309: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204220.13412: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204220.13548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.13580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.13617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.13660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.13681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.13739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.13808: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.13853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.13938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.13958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.14122: variable 'network_connections' from source: play vars 41445 1727204220.14195: variable 'profile' from source: play vars 41445 1727204220.14281: variable 'profile' from source: play vars 41445 1727204220.14293: variable 'interface' from source: set_fact 41445 1727204220.14356: variable 'interface' from source: set_fact 41445 1727204220.14686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41445 1727204220.14892: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204220.14952: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204220.15007: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204220.15054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204220.15131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204220.15165: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204220.15212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.15298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204220.15327: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204220.15636: variable 'network_connections' from source: play vars 41445 1727204220.15647: variable 'profile' from source: play vars 41445 1727204220.15731: variable 'profile' from source: play vars 41445 1727204220.15742: variable 'interface' from source: set_fact 41445 1727204220.15809: variable 'interface' from source: set_fact 41445 1727204220.15850: variable '__network_packages_default_wireless' from source: role '' defaults 41445 1727204220.15935: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204220.16248: variable 'network_connections' from source: play vars 41445 1727204220.16270: variable 'profile' from source: play vars 41445 1727204220.16340: variable 'profile' from source: play vars 41445 1727204220.16380: variable 'interface' from source: set_fact 41445 1727204220.16435: variable 'interface' from source: set_fact 41445 1727204220.16464: variable '__network_packages_default_team' from source: role '' defaults 41445 1727204220.16560: variable '__network_team_connections_defined' from source: role '' defaults 41445 1727204220.16895: variable 
'network_connections' from source: play vars 41445 1727204220.16922: variable 'profile' from source: play vars 41445 1727204220.16989: variable 'profile' from source: play vars 41445 1727204220.17031: variable 'interface' from source: set_fact 41445 1727204220.17081: variable 'interface' from source: set_fact 41445 1727204220.17156: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204220.17226: variable '__network_service_name_default_initscripts' from source: role '' defaults 41445 1727204220.17239: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204220.17360: variable '__network_packages_default_initscripts' from source: role '' defaults 41445 1727204220.17539: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41445 1727204220.18267: variable 'network_connections' from source: play vars 41445 1727204220.18284: variable 'profile' from source: play vars 41445 1727204220.18351: variable 'profile' from source: play vars 41445 1727204220.18361: variable 'interface' from source: set_fact 41445 1727204220.18441: variable 'interface' from source: set_fact 41445 1727204220.18496: variable 'ansible_distribution' from source: facts 41445 1727204220.18499: variable '__network_rh_distros' from source: role '' defaults 41445 1727204220.18501: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.18503: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41445 1727204220.18685: variable 'ansible_distribution' from source: facts 41445 1727204220.18694: variable '__network_rh_distros' from source: role '' defaults 41445 1727204220.18705: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.18727: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41445 1727204220.18931: variable 'ansible_distribution' from source: 
facts 41445 1727204220.18934: variable '__network_rh_distros' from source: role '' defaults 41445 1727204220.18936: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.18968: variable 'network_provider' from source: set_fact 41445 1727204220.19041: variable 'omit' from source: magic vars 41445 1727204220.19044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204220.19069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204220.19101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204220.19123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204220.19139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204220.19174: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204220.19184: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.19190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.19286: Set connection var ansible_shell_executable to /bin/sh 41445 1727204220.19294: Set connection var ansible_shell_type to sh 41445 1727204220.19367: Set connection var ansible_pipelining to False 41445 1727204220.19370: Set connection var ansible_timeout to 10 41445 1727204220.19372: Set connection var ansible_connection to ssh 41445 1727204220.19375: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204220.19379: variable 'ansible_shell_executable' from source: unknown 41445 1727204220.19381: variable 'ansible_connection' from source: unknown 41445 1727204220.19383: variable 'ansible_module_compression' from source: unknown 41445 1727204220.19385: 
variable 'ansible_shell_type' from source: unknown 41445 1727204220.19387: variable 'ansible_shell_executable' from source: unknown 41445 1727204220.19389: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.19401: variable 'ansible_pipelining' from source: unknown 41445 1727204220.19408: variable 'ansible_timeout' from source: unknown 41445 1727204220.19416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.19532: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204220.19547: variable 'omit' from source: magic vars 41445 1727204220.19556: starting attempt loop 41445 1727204220.19562: running the handler 41445 1727204220.19692: variable 'ansible_facts' from source: unknown 41445 1727204220.20483: _low_level_execute_command(): starting 41445 1727204220.20495: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204220.21556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.21767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204220.21869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204220.21944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204220.21993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204220.23671: stdout chunk (state=3): >>>/root <<< 41445 1727204220.23986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204220.23990: stdout chunk (state=3): >>><<< 41445 1727204220.23992: stderr chunk (state=3): >>><<< 41445 1727204220.23995: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204220.24006: _low_level_execute_command(): starting 41445 1727204220.24202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522 `" && echo ansible-tmp-1727204220.2397127-43772-255342243735522="` echo /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522 `" ) && sleep 0' 41445 1727204220.25421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204220.25425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.25428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204220.25430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.25657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204220.25715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204220.26057: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204220.27628: stdout chunk (state=3): >>>ansible-tmp-1727204220.2397127-43772-255342243735522=/root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522 <<< 41445 1727204220.27827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204220.27856: stderr chunk (state=3): >>><<< 41445 1727204220.27864: stdout chunk (state=3): >>><<< 41445 1727204220.27888: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204220.2397127-43772-255342243735522=/root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204220.27927: variable 'ansible_module_compression' from source: unknown 41445 1727204220.28161: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41445 1727204220.28289: variable 'ansible_facts' from source: unknown 41445 1727204220.28734: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py 41445 1727204220.28970: Sending initial data 41445 1727204220.28988: Sent initial data (156 bytes) 41445 1727204220.29793: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.29834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204220.29844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204220.29866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204220.29964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204220.31492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41445 1727204220.31506: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204220.31529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204220.31561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpfh_fhk4b /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py <<< 41445 1727204220.31569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py" <<< 41445 1727204220.31594: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpfh_fhk4b" to remote "/root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py" <<< 41445 1727204220.31601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py" <<< 41445 1727204220.32953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204220.32956: stderr chunk (state=3): >>><<< 41445 1727204220.33067: stdout chunk (state=3): >>><<< 41445 1727204220.33070: done transferring module to remote 41445 1727204220.33073: _low_level_execute_command(): starting 41445 
1727204220.33081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/ /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py && sleep 0' 41445 1727204220.33727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204220.33786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204220.33806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.33865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204220.33884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204220.33917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204220.34148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204220.35853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204220.35863: stdout chunk (state=3): >>><<< 41445 1727204220.35873: stderr chunk (state=3): >>><<< 41445 1727204220.35901: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204220.35910: _low_level_execute_command(): starting 41445 1727204220.35929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/AnsiballZ_systemd.py && sleep 0' 41445 1727204220.36533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204220.36547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204220.36560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204220.36620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.36726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204220.36788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204220.36881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204220.65534: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not 
set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10489856", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296247808", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1881158000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", 
"CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 41445 1727204220.65540: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-b<<< 41445 1727204220.65546: stdout chunk (state=3): >>>roker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", 
"InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41445 1727204220.67583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
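[Editor's note] The module stdout streamed above is one JSON document. A small Python sketch — a hypothetical helper, not part of Ansible itself — showing how fields visible in this log (`ActiveState`, `UnitFileState`, `MainPID`) could be pulled out of such a result:

```python
import json

def service_summary(result_json: str) -> dict:
    """Summarize an Ansible systemd-module result like the one logged above."""
    result = json.loads(result_json)
    status = result.get("status", {})
    return {
        "name": result.get("name"),
        "active": status.get("ActiveState") == "active",
        "enabled": status.get("UnitFileState") == "enabled",
        "main_pid": int(status.get("MainPID", "0")),
    }

# Trimmed-down stand-in for the full result seen in the log.
sample = json.dumps({
    "name": "NetworkManager",
    "changed": False,
    "status": {"ActiveState": "active", "UnitFileState": "enabled", "MainPID": "704"},
})
print(service_summary(sample))
# → {'name': 'NetworkManager', 'active': True, 'enabled': True, 'main_pid': 704}
```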
<<< 41445 1727204220.67588: stdout chunk (state=3): >>><<< 41445 1727204220.67591: stderr chunk (state=3): >>><<< 41445 1727204220.67598: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "704", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainStartTimestampMonotonic": "28990148", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ExecMainHandoffTimestampMonotonic": "29005881", "ExecMainPID": "704", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10489856", "MemoryPeak": "13586432", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3296247808", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "1881158000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service shutdown.target NetworkManager-wait-online.service multi-user.target network.target", "After": "dbus-broker.service systemd-journald.socket network-pre.target basic.target cloud-init-local.service dbus.socket system.slice sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:49:45 EDT", "StateChangeTimestampMonotonic": "362725592", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:11 EDT", "InactiveExitTimestampMonotonic": "28990654", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:12 EDT", "ActiveEnterTimestampMonotonic": "29769382", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:11 EDT", "ConditionTimestampMonotonic": "28989295", "AssertTimestamp": "Tue 2024-09-24 14:44:11 EDT", "AssertTimestampMonotonic": "28989297", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "70a845f8a1964db89963090ed497f47f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204220.67618: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204220.67639: _low_level_execute_command(): starting 41445 1727204220.67642: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204220.2397127-43772-255342243735522/ > /dev/null 2>&1 && sleep 0' 41445 1727204220.68250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204220.68262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204220.68273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204220.68294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204220.68307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204220.68315: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204220.68323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.68356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204220.68435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204220.68500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204220.68507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204220.70350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204220.70354: stdout chunk (state=3): >>><<< 41445 1727204220.70377: stderr chunk (state=3): >>><<< 41445 1727204220.70381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204220.70383: handler run complete 41445 1727204220.70499: attempt loop complete, returning result 41445 1727204220.70502: _execute() done 41445 1727204220.70504: dumping result to json 41445 1727204220.70506: done dumping result, returning 41445 1727204220.70508: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-bf02-eee4-0000000000e3] 41445 1727204220.70513: sending task result for task 028d2410-947f-bf02-eee4-0000000000e3 41445 1727204220.70978: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e3 41445 1727204220.70981: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204220.71099: no more pending results, returning what we have 41445 1727204220.71104: results queue empty 41445 1727204220.71106: checking for any_errors_fatal 41445 1727204220.71113: done checking for any_errors_fatal 41445 1727204220.71114: checking for max_fail_percentage 41445 1727204220.71116: done checking for max_fail_percentage 41445 1727204220.71117: checking to see if all hosts have failed and the running result is not ok 41445 1727204220.71117: done checking to see if all hosts have failed 41445 1727204220.71118: getting the remaining hosts for this loop 41445 1727204220.71119: done getting the remaining hosts for this loop 41445 1727204220.71123: getting the next task for host managed-node3 41445 1727204220.71129: done getting next task for host managed-node3 41445 1727204220.71132: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 41445 1727204220.71141: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204220.71151: getting variables 41445 1727204220.71153: in VariableManager get_vars() 41445 1727204220.71198: Calling all_inventory to load vars for managed-node3 41445 1727204220.71201: Calling groups_inventory to load vars for managed-node3 41445 1727204220.71203: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204220.71215: Calling all_plugins_play to load vars for managed-node3 41445 1727204220.71218: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204220.71221: Calling groups_plugins_play to load vars for managed-node3 41445 1727204220.72537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204220.73442: done with get_vars() 41445 1727204220.73459: done getting variables 41445 1727204220.73504: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.684) 0:00:39.523 ***** 41445 1727204220.73528: entering _queue_task() for managed-node3/service 41445 1727204220.73783: worker is 1 (out of 1 available) 41445 1727204220.73797: exiting _queue_task() for managed-node3/service 41445 1727204220.73809: done queuing things 
up, now waiting for results queue to drain 41445 1727204220.73810: waiting for pending results... 41445 1727204220.74060: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41445 1727204220.74260: in run() - task 028d2410-947f-bf02-eee4-0000000000e4 41445 1727204220.74265: variable 'ansible_search_path' from source: unknown 41445 1727204220.74268: variable 'ansible_search_path' from source: unknown 41445 1727204220.74270: calling self._execute() 41445 1727204220.74305: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.74309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.74331: variable 'omit' from source: magic vars 41445 1727204220.74730: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.74784: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204220.74858: variable 'network_provider' from source: set_fact 41445 1727204220.74864: Evaluated conditional (network_provider == "nm"): True 41445 1727204220.74956: variable '__network_wpa_supplicant_required' from source: role '' defaults 41445 1727204220.75059: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41445 1727204220.75238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204220.76765: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204220.76818: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 1727204220.76845: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204220.76869: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204220.76898: 
Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204220.76978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.76999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.77024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.77050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.77060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.77094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.77112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.77134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.77158: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.77169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.77199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204220.77217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204220.77238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.77274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204220.77287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204220.77397: variable 'network_connections' from source: play vars 41445 1727204220.77406: variable 'profile' from source: play vars 41445 1727204220.77528: variable 'profile' from source: play vars 41445 1727204220.77531: variable 'interface' from source: set_fact 41445 1727204220.77589: variable 'interface' from source: set_fact 41445 1727204220.77623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
41445 1727204220.77790: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41445 1727204220.77816: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41445 1727204220.77846: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41445 1727204220.77982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41445 1727204220.77986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41445 1727204220.77989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41445 1727204220.77991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204220.77994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41445 1727204220.78099: variable '__network_wireless_connections_defined' from source: role '' defaults 41445 1727204220.78348: variable 'network_connections' from source: play vars 41445 1727204220.78359: variable 'profile' from source: play vars 41445 1727204220.78432: variable 'profile' from source: play vars 41445 1727204220.78452: variable 'interface' from source: set_fact 41445 1727204220.78520: variable 'interface' from source: set_fact 41445 1727204220.78565: Evaluated conditional (__network_wpa_supplicant_required): False 41445 1727204220.78661: when evaluation is False, skipping this task 41445 1727204220.78664: 
_execute() done 41445 1727204220.78674: dumping result to json 41445 1727204220.78679: done dumping result, returning 41445 1727204220.78682: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-bf02-eee4-0000000000e4] 41445 1727204220.78684: sending task result for task 028d2410-947f-bf02-eee4-0000000000e4 41445 1727204220.78754: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e4 41445 1727204220.78757: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41445 1727204220.78815: no more pending results, returning what we have 41445 1727204220.78818: results queue empty 41445 1727204220.78819: checking for any_errors_fatal 41445 1727204220.78841: done checking for any_errors_fatal 41445 1727204220.78842: checking for max_fail_percentage 41445 1727204220.78844: done checking for max_fail_percentage 41445 1727204220.78844: checking to see if all hosts have failed and the running result is not ok 41445 1727204220.78845: done checking to see if all hosts have failed 41445 1727204220.78846: getting the remaining hosts for this loop 41445 1727204220.78847: done getting the remaining hosts for this loop 41445 1727204220.78851: getting the next task for host managed-node3 41445 1727204220.78858: done getting next task for host managed-node3 41445 1727204220.78862: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204220.78863: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204220.78881: getting variables 41445 1727204220.78883: in VariableManager get_vars() 41445 1727204220.78922: Calling all_inventory to load vars for managed-node3 41445 1727204220.78925: Calling groups_inventory to load vars for managed-node3 41445 1727204220.78928: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204220.78938: Calling all_plugins_play to load vars for managed-node3 41445 1727204220.78941: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204220.78943: Calling groups_plugins_play to load vars for managed-node3 41445 1727204220.81163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204220.84663: done with get_vars() 41445 1727204220.84920: done getting variables 41445 1727204220.84991: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.115) 0:00:39.638 ***** 41445 1727204220.85097: entering _queue_task() for managed-node3/service 41445 1727204220.85825: worker is 1 (out of 1 available) 41445 1727204220.85838: exiting _queue_task() for managed-node3/service 41445 1727204220.85852: done queuing things up, now waiting for results queue to drain 41445 1727204220.85854: waiting for pending results... 
41445 1727204220.86639: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 41445 1727204220.86652: in run() - task 028d2410-947f-bf02-eee4-0000000000e5 41445 1727204220.86656: variable 'ansible_search_path' from source: unknown 41445 1727204220.86658: variable 'ansible_search_path' from source: unknown 41445 1727204220.86686: calling self._execute() 41445 1727204220.86820: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.86824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.86857: variable 'omit' from source: magic vars 41445 1727204220.87484: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.87488: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204220.87643: variable 'network_provider' from source: set_fact 41445 1727204220.87646: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204220.87649: when evaluation is False, skipping this task 41445 1727204220.87651: _execute() done 41445 1727204220.87653: dumping result to json 41445 1727204220.87654: done dumping result, returning 41445 1727204220.87656: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-bf02-eee4-0000000000e5] 41445 1727204220.87658: sending task result for task 028d2410-947f-bf02-eee4-0000000000e5 41445 1727204220.87841: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e5 41445 1727204220.87845: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41445 1727204220.87895: no more pending results, returning what we have 41445 1727204220.87899: results queue empty 41445 1727204220.87900: checking for any_errors_fatal 41445 1727204220.87910: done checking for 
any_errors_fatal 41445 1727204220.87911: checking for max_fail_percentage 41445 1727204220.87913: done checking for max_fail_percentage 41445 1727204220.87914: checking to see if all hosts have failed and the running result is not ok 41445 1727204220.87915: done checking to see if all hosts have failed 41445 1727204220.87916: getting the remaining hosts for this loop 41445 1727204220.87917: done getting the remaining hosts for this loop 41445 1727204220.87922: getting the next task for host managed-node3 41445 1727204220.87928: done getting next task for host managed-node3 41445 1727204220.87933: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204220.87935: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204220.87956: getting variables 41445 1727204220.87959: in VariableManager get_vars() 41445 1727204220.88195: Calling all_inventory to load vars for managed-node3 41445 1727204220.88239: Calling groups_inventory to load vars for managed-node3 41445 1727204220.88342: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204220.88385: Calling all_plugins_play to load vars for managed-node3 41445 1727204220.88390: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204220.88394: Calling groups_plugins_play to load vars for managed-node3 41445 1727204220.90425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204220.92219: done with get_vars() 41445 1727204220.92254: done getting variables 41445 1727204220.92324: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.072) 0:00:39.711 ***** 41445 1727204220.92357: entering _queue_task() for managed-node3/copy 41445 1727204220.92727: worker is 1 (out of 1 available) 41445 1727204220.92742: exiting _queue_task() for managed-node3/copy 41445 1727204220.92755: done queuing things up, now waiting for results queue to drain 41445 1727204220.92756: waiting for pending results... 
41445 1727204220.93065: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41445 1727204220.93255: in run() - task 028d2410-947f-bf02-eee4-0000000000e6 41445 1727204220.93267: variable 'ansible_search_path' from source: unknown 41445 1727204220.93271: variable 'ansible_search_path' from source: unknown 41445 1727204220.93274: calling self._execute() 41445 1727204220.93373: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.93491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.93495: variable 'omit' from source: magic vars 41445 1727204220.93779: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.93790: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204220.93983: variable 'network_provider' from source: set_fact 41445 1727204220.93988: Evaluated conditional (network_provider == "initscripts"): False 41445 1727204220.93990: when evaluation is False, skipping this task 41445 1727204220.93993: _execute() done 41445 1727204220.93996: dumping result to json 41445 1727204220.93998: done dumping result, returning 41445 1727204220.94001: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-bf02-eee4-0000000000e6] 41445 1727204220.94003: sending task result for task 028d2410-947f-bf02-eee4-0000000000e6 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41445 1727204220.94129: no more pending results, returning what we have 41445 1727204220.94134: results queue empty 41445 1727204220.94137: checking for any_errors_fatal 41445 1727204220.94143: done checking for any_errors_fatal 41445 1727204220.94144: checking for max_fail_percentage 41445 
1727204220.94146: done checking for max_fail_percentage 41445 1727204220.94147: checking to see if all hosts have failed and the running result is not ok 41445 1727204220.94148: done checking to see if all hosts have failed 41445 1727204220.94149: getting the remaining hosts for this loop 41445 1727204220.94150: done getting the remaining hosts for this loop 41445 1727204220.94155: getting the next task for host managed-node3 41445 1727204220.94162: done getting next task for host managed-node3 41445 1727204220.94166: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204220.94168: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204220.94196: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e6 41445 1727204220.94199: WORKER PROCESS EXITING 41445 1727204220.94381: getting variables 41445 1727204220.94383: in VariableManager get_vars() 41445 1727204220.94419: Calling all_inventory to load vars for managed-node3 41445 1727204220.94422: Calling groups_inventory to load vars for managed-node3 41445 1727204220.94425: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204220.94434: Calling all_plugins_play to load vars for managed-node3 41445 1727204220.94437: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204220.94440: Calling groups_plugins_play to load vars for managed-node3 41445 1727204220.95896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204220.97224: done with get_vars() 41445 1727204220.97243: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.049) 0:00:39.760 ***** 41445 1727204220.97308: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204220.97570: worker is 1 (out of 1 available) 41445 1727204220.97586: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 41445 1727204220.97599: done queuing things up, now waiting for results queue to drain 41445 1727204220.97600: waiting for pending results... 41445 1727204220.97785: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41445 1727204220.97863: in run() - task 028d2410-947f-bf02-eee4-0000000000e7 41445 1727204220.97878: variable 'ansible_search_path' from source: unknown 41445 1727204220.97882: variable 'ansible_search_path' from source: unknown 41445 1727204220.97912: calling self._execute() 41445 1727204220.98021: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204220.98026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204220.98029: variable 'omit' from source: magic vars 41445 1727204220.98366: variable 'ansible_distribution_major_version' from source: facts 41445 1727204220.98370: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204220.98378: variable 'omit' from source: magic vars 41445 1727204220.98416: variable 'omit' from source: magic vars 41445 1727204220.98541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41445 1727204221.00207: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41445 1727204221.00262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41445 
1727204221.00284: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41445 1727204221.00311: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41445 1727204221.00333: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41445 1727204221.00398: variable 'network_provider' from source: set_fact 41445 1727204221.00499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41445 1727204221.00534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41445 1727204221.00551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41445 1727204221.00579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41445 1727204221.00592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41445 1727204221.00647: variable 'omit' from source: magic vars 41445 1727204221.00729: variable 'omit' from source: magic vars 41445 1727204221.00805: variable 'network_connections' from source: play vars 41445 1727204221.00811: variable 'profile' from source: play vars 41445 1727204221.00862: variable 'profile' from source: play vars 41445 1727204221.00865: variable 'interface' from source: set_fact 
41445 1727204221.00912: variable 'interface' from source: set_fact 41445 1727204221.01010: variable 'omit' from source: magic vars 41445 1727204221.01019: variable '__lsr_ansible_managed' from source: task vars 41445 1727204221.01063: variable '__lsr_ansible_managed' from source: task vars 41445 1727204221.01252: Loaded config def from plugin (lookup/template) 41445 1727204221.01255: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41445 1727204221.01279: File lookup term: get_ansible_managed.j2 41445 1727204221.01283: variable 'ansible_search_path' from source: unknown 41445 1727204221.01288: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41445 1727204221.01299: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41445 1727204221.01312: variable 'ansible_search_path' from source: unknown 41445 
1727204221.04956: variable 'ansible_managed' from source: unknown 41445 1727204221.05046: variable 'omit' from source: magic vars 41445 1727204221.05069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204221.05096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204221.05111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204221.05127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.05136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.05157: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204221.05161: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.05163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.05234: Set connection var ansible_shell_executable to /bin/sh 41445 1727204221.05237: Set connection var ansible_shell_type to sh 41445 1727204221.05240: Set connection var ansible_pipelining to False 41445 1727204221.05248: Set connection var ansible_timeout to 10 41445 1727204221.05250: Set connection var ansible_connection to ssh 41445 1727204221.05256: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204221.05274: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.05279: variable 'ansible_connection' from source: unknown 41445 1727204221.05282: variable 'ansible_module_compression' from source: unknown 41445 1727204221.05285: variable 'ansible_shell_type' from source: unknown 41445 1727204221.05288: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.05290: variable 'ansible_host' from source: 
host vars for 'managed-node3' 41445 1727204221.05294: variable 'ansible_pipelining' from source: unknown 41445 1727204221.05296: variable 'ansible_timeout' from source: unknown 41445 1727204221.05298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.05402: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204221.05589: variable 'omit' from source: magic vars 41445 1727204221.05592: starting attempt loop 41445 1727204221.05594: running the handler 41445 1727204221.05597: _low_level_execute_command(): starting 41445 1727204221.05599: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204221.06093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204221.06105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.06117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.06130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204221.06149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204221.06155: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204221.06158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.06180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204221.06183: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204221.06186: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204221.06188: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.06261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.06265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204221.06267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204221.06270: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204221.06271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.06350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.06353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.06356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.06390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.08101: stdout chunk (state=3): >>>/root <<< 41445 1727204221.08207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.08231: stderr chunk (state=3): >>><<< 41445 1727204221.08235: stdout chunk (state=3): >>><<< 41445 1727204221.08255: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204221.08266: _low_level_execute_command(): starting 41445 1727204221.08272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627 `" && echo ansible-tmp-1727204221.082557-43969-88622711708627="` echo /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627 `" ) && sleep 0' 41445 1727204221.08752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.08756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204221.08758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.08761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.08763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204221.08765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.08808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.08811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.08814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.08863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.10807: stdout chunk (state=3): >>>ansible-tmp-1727204221.082557-43969-88622711708627=/root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627 <<< 41445 1727204221.10910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.10942: stderr chunk (state=3): >>><<< 41445 1727204221.10945: stdout chunk (state=3): >>><<< 41445 1727204221.10961: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204221.082557-43969-88622711708627=/root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204221.11012: variable 'ansible_module_compression' from source: unknown 41445 1727204221.11049: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41445 1727204221.11077: variable 'ansible_facts' from source: unknown 41445 1727204221.11146: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py 41445 1727204221.11257: Sending initial data 41445 1727204221.11260: Sent initial data (166 bytes) 41445 1727204221.11742: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.11745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204221.11750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.11752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.11754: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204221.11756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.11806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.11809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.11811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.11854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.13457: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204221.13490: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204221.13530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpnmxluf32 /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py <<< 41445 1727204221.13533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py" <<< 41445 1727204221.13562: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpnmxluf32" to remote "/root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py" <<< 41445 1727204221.13566: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py" <<< 41445 1727204221.14252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.14300: stderr chunk (state=3): >>><<< 41445 1727204221.14304: stdout chunk (state=3): >>><<< 41445 1727204221.14352: done transferring module to remote 41445 1727204221.14364: _low_level_execute_command(): starting 41445 1727204221.14367: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/ /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py && sleep 0' 41445 1727204221.14836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.14840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.14842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.14844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.14884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.14905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.14914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.14946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.16784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.16809: stderr chunk (state=3): >>><<< 41445 1727204221.16812: stdout chunk (state=3): >>><<< 41445 1727204221.16830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204221.16834: _low_level_execute_command(): starting 41445 1727204221.16837: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/AnsiballZ_network_connections.py && sleep 0' 41445 1727204221.17520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204221.17524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.17526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.17529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204221.17536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204221.17543: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204221.17587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.17591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204221.17594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.17658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.17689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.17760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.45042: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_p_31qczw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_p_31qczw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/d37af1d3-4475-460d-968a-fd721e68b223: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}} <<< 41445 1727204221.46873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204221.46907: stdout chunk (state=3): >>><<< 41445 1727204221.46910: stderr chunk (state=3): >>><<< 41445 1727204221.46913: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_p_31qczw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_p_31qczw/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/d37af1d3-4475-460d-968a-fd721e68b223: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204221.46945: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204221.46985: _low_level_execute_command(): starting 41445 
1727204221.46988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204221.082557-43969-88622711708627/ > /dev/null 2>&1 && sleep 0' 41445 1727204221.47889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.47896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.47910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.47997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.49798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.49832: stderr chunk (state=3): >>><<< 41445 1727204221.49849: stdout chunk (state=3): >>><<< 41445 1727204221.50081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204221.50086: handler run complete 41445 1727204221.50088: attempt loop complete, returning result 41445 1727204221.50091: _execute() done 41445 1727204221.50093: dumping result to json 41445 1727204221.50095: done dumping result, returning 41445 1727204221.50097: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-bf02-eee4-0000000000e7] 41445 1727204221.50099: sending task result for task 028d2410-947f-bf02-eee4-0000000000e7 41445 1727204221.50180: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e7 changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 41445 1727204221.50273: no more pending results, returning what 
we have 41445 1727204221.50279: results queue empty 41445 1727204221.50280: checking for any_errors_fatal 41445 1727204221.50286: done checking for any_errors_fatal 41445 1727204221.50287: checking for max_fail_percentage 41445 1727204221.50288: done checking for max_fail_percentage 41445 1727204221.50289: checking to see if all hosts have failed and the running result is not ok 41445 1727204221.50290: done checking to see if all hosts have failed 41445 1727204221.50291: getting the remaining hosts for this loop 41445 1727204221.50292: done getting the remaining hosts for this loop 41445 1727204221.50296: getting the next task for host managed-node3 41445 1727204221.50301: done getting next task for host managed-node3 41445 1727204221.50306: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204221.50307: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204221.50317: getting variables 41445 1727204221.50319: in VariableManager get_vars() 41445 1727204221.50355: Calling all_inventory to load vars for managed-node3 41445 1727204221.50358: Calling groups_inventory to load vars for managed-node3 41445 1727204221.50360: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204221.50368: Calling all_plugins_play to load vars for managed-node3 41445 1727204221.50371: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204221.50374: Calling groups_plugins_play to load vars for managed-node3 41445 1727204221.51124: WORKER PROCESS EXITING 41445 1727204221.52549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204221.54243: done with get_vars() 41445 1727204221.54278: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:01 -0400 (0:00:00.570) 0:00:40.331 ***** 41445 1727204221.54364: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204221.54739: worker is 1 (out of 1 available) 41445 1727204221.54751: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 41445 1727204221.54762: done queuing things up, now waiting for results queue to drain 41445 1727204221.54763: waiting for pending results... 
41445 1727204221.55297: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 41445 1727204221.55303: in run() - task 028d2410-947f-bf02-eee4-0000000000e8 41445 1727204221.55306: variable 'ansible_search_path' from source: unknown 41445 1727204221.55308: variable 'ansible_search_path' from source: unknown 41445 1727204221.55314: calling self._execute() 41445 1727204221.55400: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.55404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.55416: variable 'omit' from source: magic vars 41445 1727204221.55852: variable 'ansible_distribution_major_version' from source: facts 41445 1727204221.55866: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204221.55997: variable 'network_state' from source: role '' defaults 41445 1727204221.56009: Evaluated conditional (network_state != {}): False 41445 1727204221.56015: when evaluation is False, skipping this task 41445 1727204221.56018: _execute() done 41445 1727204221.56020: dumping result to json 41445 1727204221.56023: done dumping result, returning 41445 1727204221.56026: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-bf02-eee4-0000000000e8] 41445 1727204221.56091: sending task result for task 028d2410-947f-bf02-eee4-0000000000e8 41445 1727204221.56174: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e8 41445 1727204221.56180: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41445 1727204221.56458: no more pending results, returning what we have 41445 1727204221.56463: results queue empty 41445 1727204221.56464: checking for any_errors_fatal 41445 1727204221.56474: done checking for any_errors_fatal 
41445 1727204221.56478: checking for max_fail_percentage 41445 1727204221.56480: done checking for max_fail_percentage 41445 1727204221.56481: checking to see if all hosts have failed and the running result is not ok 41445 1727204221.56482: done checking to see if all hosts have failed 41445 1727204221.56483: getting the remaining hosts for this loop 41445 1727204221.56485: done getting the remaining hosts for this loop 41445 1727204221.56489: getting the next task for host managed-node3 41445 1727204221.56494: done getting next task for host managed-node3 41445 1727204221.56498: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204221.56501: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204221.56515: getting variables 41445 1727204221.56517: in VariableManager get_vars() 41445 1727204221.56556: Calling all_inventory to load vars for managed-node3 41445 1727204221.56559: Calling groups_inventory to load vars for managed-node3 41445 1727204221.56562: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204221.56574: Calling all_plugins_play to load vars for managed-node3 41445 1727204221.56640: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204221.56645: Calling groups_plugins_play to load vars for managed-node3 41445 1727204221.59843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204221.64022: done with get_vars() 41445 1727204221.64058: done getting variables 41445 1727204221.64127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:01 -0400 (0:00:00.097) 0:00:40.429 ***** 41445 1727204221.64158: entering _queue_task() for managed-node3/debug 41445 1727204221.64537: worker is 1 (out of 1 available) 41445 1727204221.64551: exiting _queue_task() for managed-node3/debug 41445 1727204221.64562: done queuing things up, now waiting for results queue to drain 41445 1727204221.64564: waiting for pending results... 
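
The task queued here loads the `debug` action module and, per the result printed further down, renders `__network_connections_result.stderr_lines`. A minimal sketch of such a task, assuming the variable was registered by the earlier connection-provider step (the task name and variable come from the log; the exact body is an assumption):

```yaml
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```
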
41445 1727204221.64998: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41445 1727204221.65004: in run() - task 028d2410-947f-bf02-eee4-0000000000e9 41445 1727204221.65009: variable 'ansible_search_path' from source: unknown 41445 1727204221.65012: variable 'ansible_search_path' from source: unknown 41445 1727204221.65051: calling self._execute() 41445 1727204221.65166: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.65170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.65184: variable 'omit' from source: magic vars 41445 1727204221.65577: variable 'ansible_distribution_major_version' from source: facts 41445 1727204221.65618: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204221.65621: variable 'omit' from source: magic vars 41445 1727204221.65638: variable 'omit' from source: magic vars 41445 1727204221.65681: variable 'omit' from source: magic vars 41445 1727204221.65726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204221.65981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204221.65984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204221.65987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.65989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.65991: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204221.65997: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.65999: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204221.66001: Set connection var ansible_shell_executable to /bin/sh 41445 1727204221.66004: Set connection var ansible_shell_type to sh 41445 1727204221.66005: Set connection var ansible_pipelining to False 41445 1727204221.66007: Set connection var ansible_timeout to 10 41445 1727204221.66009: Set connection var ansible_connection to ssh 41445 1727204221.66011: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204221.66013: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.66015: variable 'ansible_connection' from source: unknown 41445 1727204221.66018: variable 'ansible_module_compression' from source: unknown 41445 1727204221.66020: variable 'ansible_shell_type' from source: unknown 41445 1727204221.66022: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.66024: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.66028: variable 'ansible_pipelining' from source: unknown 41445 1727204221.66031: variable 'ansible_timeout' from source: unknown 41445 1727204221.66035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.66180: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204221.66190: variable 'omit' from source: magic vars 41445 1727204221.66195: starting attempt loop 41445 1727204221.66198: running the handler 41445 1727204221.66360: variable '__network_connections_result' from source: set_fact 41445 1727204221.66406: handler run complete 41445 1727204221.66425: attempt loop complete, returning result 41445 1727204221.66428: _execute() done 41445 1727204221.66436: dumping result to json 41445 1727204221.66440: 
done dumping result, returning 41445 1727204221.66452: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-bf02-eee4-0000000000e9] 41445 1727204221.66457: sending task result for task 028d2410-947f-bf02-eee4-0000000000e9 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 41445 1727204221.66609: no more pending results, returning what we have 41445 1727204221.66616: results queue empty 41445 1727204221.66617: checking for any_errors_fatal 41445 1727204221.66623: done checking for any_errors_fatal 41445 1727204221.66624: checking for max_fail_percentage 41445 1727204221.66625: done checking for max_fail_percentage 41445 1727204221.66626: checking to see if all hosts have failed and the running result is not ok 41445 1727204221.66627: done checking to see if all hosts have failed 41445 1727204221.66628: getting the remaining hosts for this loop 41445 1727204221.66629: done getting the remaining hosts for this loop 41445 1727204221.66633: getting the next task for host managed-node3 41445 1727204221.66639: done getting next task for host managed-node3 41445 1727204221.66643: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204221.66645: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204221.66656: getting variables 41445 1727204221.66658: in VariableManager get_vars() 41445 1727204221.66698: Calling all_inventory to load vars for managed-node3 41445 1727204221.66701: Calling groups_inventory to load vars for managed-node3 41445 1727204221.66704: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204221.66716: Calling all_plugins_play to load vars for managed-node3 41445 1727204221.66720: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204221.66722: Calling groups_plugins_play to load vars for managed-node3 41445 1727204221.67290: done sending task result for task 028d2410-947f-bf02-eee4-0000000000e9 41445 1727204221.67295: WORKER PROCESS EXITING 41445 1727204221.68444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204221.70212: done with get_vars() 41445 1727204221.70251: done getting variables 41445 1727204221.70323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:01 -0400 (0:00:00.061) 0:00:40.491 ***** 41445 1727204221.70358: entering _queue_task() for managed-node3/debug 41445 1727204221.70938: worker is 1 (out of 1 available) 41445 1727204221.70953: exiting _queue_task() for managed-node3/debug 41445 1727204221.70965: done queuing things up, now waiting for results queue to drain 41445 1727204221.70967: waiting for pending results... 
41445 1727204221.71245: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41445 1727204221.71382: in run() - task 028d2410-947f-bf02-eee4-0000000000ea 41445 1727204221.71402: variable 'ansible_search_path' from source: unknown 41445 1727204221.71405: variable 'ansible_search_path' from source: unknown 41445 1727204221.71446: calling self._execute() 41445 1727204221.71556: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.71561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.71571: variable 'omit' from source: magic vars 41445 1727204221.71988: variable 'ansible_distribution_major_version' from source: facts 41445 1727204221.72002: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204221.72009: variable 'omit' from source: magic vars 41445 1727204221.72063: variable 'omit' from source: magic vars 41445 1727204221.72101: variable 'omit' from source: magic vars 41445 1727204221.72147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204221.72187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204221.72207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204221.72228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.72242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.72274: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204221.72279: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.72283: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 41445 1727204221.72391: Set connection var ansible_shell_executable to /bin/sh 41445 1727204221.72395: Set connection var ansible_shell_type to sh 41445 1727204221.72400: Set connection var ansible_pipelining to False 41445 1727204221.72408: Set connection var ansible_timeout to 10 41445 1727204221.72410: Set connection var ansible_connection to ssh 41445 1727204221.72580: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204221.72584: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.72586: variable 'ansible_connection' from source: unknown 41445 1727204221.72589: variable 'ansible_module_compression' from source: unknown 41445 1727204221.72591: variable 'ansible_shell_type' from source: unknown 41445 1727204221.72594: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.72597: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.72600: variable 'ansible_pipelining' from source: unknown 41445 1727204221.72602: variable 'ansible_timeout' from source: unknown 41445 1727204221.72605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.72626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204221.72636: variable 'omit' from source: magic vars 41445 1727204221.72641: starting attempt loop 41445 1727204221.72644: running the handler 41445 1727204221.72694: variable '__network_connections_result' from source: set_fact 41445 1727204221.72788: variable '__network_connections_result' from source: set_fact 41445 1727204221.72938: handler run complete 41445 1727204221.72961: attempt loop complete, returning result 41445 1727204221.72970: 
_execute() done 41445 1727204221.72973: dumping result to json 41445 1727204221.72978: done dumping result, returning 41445 1727204221.72989: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-bf02-eee4-0000000000ea] 41445 1727204221.72995: sending task result for task 028d2410-947f-bf02-eee4-0000000000ea 41445 1727204221.73278: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ea ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 41445 1727204221.73356: WORKER PROCESS EXITING 41445 1727204221.73458: no more pending results, returning what we have 41445 1727204221.73462: results queue empty 41445 1727204221.73463: checking for any_errors_fatal 41445 1727204221.73468: done checking for any_errors_fatal 41445 1727204221.73469: checking for max_fail_percentage 41445 1727204221.73471: done checking for max_fail_percentage 41445 1727204221.73471: checking to see if all hosts have failed and the running result is not ok 41445 1727204221.73472: done checking to see if all hosts have failed 41445 1727204221.73473: getting the remaining hosts for this loop 41445 1727204221.73474: done getting the remaining hosts for this loop 41445 1727204221.73480: getting the next task for host managed-node3 41445 1727204221.73486: done getting next task for host managed-node3 41445 1727204221.73490: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204221.73492: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204221.73501: getting variables 41445 1727204221.73504: in VariableManager get_vars() 41445 1727204221.73540: Calling all_inventory to load vars for managed-node3 41445 1727204221.73543: Calling groups_inventory to load vars for managed-node3 41445 1727204221.73545: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204221.73554: Calling all_plugins_play to load vars for managed-node3 41445 1727204221.73557: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204221.73559: Calling groups_plugins_play to load vars for managed-node3 41445 1727204221.75029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204221.76623: done with get_vars() 41445 1727204221.76654: done getting variables 41445 1727204221.76726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:01 -0400 (0:00:00.063) 0:00:40.555 ***** 41445 1727204221.76765: entering _queue_task() for managed-node3/debug 41445 1727204221.77144: worker is 1 (out of 1 available) 41445 1727204221.77162: exiting _queue_task() for managed-node3/debug 41445 1727204221.77177: done queuing things up, now waiting for results queue to drain 41445 1727204221.77178: waiting for pending results... 
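
The `module_args` dumped in the previous task result show what this run asked the role to do: provider `nm`, one connection `ethtest0` with `persistent_state: absent`. Expressed as role input, the equivalent play would look roughly like this (connection values copied from the logged invocation; the surrounding play structure is an assumption):

```yaml
- hosts: managed-node3
  vars:
    network_provider: nm
    network_connections:
      - name: ethtest0
        persistent_state: absent   # remove the profile, as seen in module_args
  roles:
    - fedora.linux_system_roles.network
```
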
41445 1727204221.77506: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41445 1727204221.77603: in run() - task 028d2410-947f-bf02-eee4-0000000000eb 41445 1727204221.77621: variable 'ansible_search_path' from source: unknown 41445 1727204221.77625: variable 'ansible_search_path' from source: unknown 41445 1727204221.77661: calling self._execute() 41445 1727204221.77790: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.77794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.77812: variable 'omit' from source: magic vars 41445 1727204221.78232: variable 'ansible_distribution_major_version' from source: facts 41445 1727204221.78249: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204221.78446: variable 'network_state' from source: role '' defaults 41445 1727204221.78450: Evaluated conditional (network_state != {}): False 41445 1727204221.78452: when evaluation is False, skipping this task 41445 1727204221.78455: _execute() done 41445 1727204221.78458: dumping result to json 41445 1727204221.78460: done dumping result, returning 41445 1727204221.78525: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-bf02-eee4-0000000000eb] 41445 1727204221.78529: sending task result for task 028d2410-947f-bf02-eee4-0000000000eb skipping: [managed-node3] => { "false_condition": "network_state != {}" } 41445 1727204221.78650: no more pending results, returning what we have 41445 1727204221.78654: results queue empty 41445 1727204221.78656: checking for any_errors_fatal 41445 1727204221.78669: done checking for any_errors_fatal 41445 1727204221.78670: checking for max_fail_percentage 41445 1727204221.78672: done checking for max_fail_percentage 41445 1727204221.78673: checking to see if all hosts have 
failed and the running result is not ok 41445 1727204221.78674: done checking to see if all hosts have failed 41445 1727204221.78674: getting the remaining hosts for this loop 41445 1727204221.78679: done getting the remaining hosts for this loop 41445 1727204221.78683: getting the next task for host managed-node3 41445 1727204221.78689: done getting next task for host managed-node3 41445 1727204221.78693: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204221.78696: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204221.78716: done sending task result for task 028d2410-947f-bf02-eee4-0000000000eb 41445 1727204221.78720: WORKER PROCESS EXITING 41445 1727204221.78946: getting variables 41445 1727204221.78949: in VariableManager get_vars() 41445 1727204221.78988: Calling all_inventory to load vars for managed-node3 41445 1727204221.78991: Calling groups_inventory to load vars for managed-node3 41445 1727204221.78993: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204221.79003: Calling all_plugins_play to load vars for managed-node3 41445 1727204221.79005: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204221.79008: Calling groups_plugins_play to load vars for managed-node3 41445 1727204221.81288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204221.84355: done with get_vars() 41445 1727204221.84403: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:01 -0400 
(0:00:00.077) 0:00:40.632 ***** 41445 1727204221.84513: entering _queue_task() for managed-node3/ping 41445 1727204221.85109: worker is 1 (out of 1 available) 41445 1727204221.85126: exiting _queue_task() for managed-node3/ping 41445 1727204221.85140: done queuing things up, now waiting for results queue to drain 41445 1727204221.85142: waiting for pending results... 41445 1727204221.85812: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 41445 1727204221.86084: in run() - task 028d2410-947f-bf02-eee4-0000000000ec 41445 1727204221.86089: variable 'ansible_search_path' from source: unknown 41445 1727204221.86092: variable 'ansible_search_path' from source: unknown 41445 1727204221.86095: calling self._execute() 41445 1727204221.86381: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.86385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.86406: variable 'omit' from source: magic vars 41445 1727204221.87032: variable 'ansible_distribution_major_version' from source: facts 41445 1727204221.87036: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204221.87039: variable 'omit' from source: magic vars 41445 1727204221.87048: variable 'omit' from source: magic vars 41445 1727204221.87099: variable 'omit' from source: magic vars 41445 1727204221.87149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204221.87186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204221.87205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204221.87292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.87296: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204221.87298: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204221.87300: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.87304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.87396: Set connection var ansible_shell_executable to /bin/sh 41445 1727204221.87399: Set connection var ansible_shell_type to sh 41445 1727204221.87405: Set connection var ansible_pipelining to False 41445 1727204221.87416: Set connection var ansible_timeout to 10 41445 1727204221.87419: Set connection var ansible_connection to ssh 41445 1727204221.87436: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204221.87507: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.87510: variable 'ansible_connection' from source: unknown 41445 1727204221.87513: variable 'ansible_module_compression' from source: unknown 41445 1727204221.87515: variable 'ansible_shell_type' from source: unknown 41445 1727204221.87517: variable 'ansible_shell_executable' from source: unknown 41445 1727204221.87519: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204221.87521: variable 'ansible_pipelining' from source: unknown 41445 1727204221.87523: variable 'ansible_timeout' from source: unknown 41445 1727204221.87525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204221.87751: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204221.87756: variable 'omit' from source: magic vars 41445 1727204221.87758: starting attempt loop 41445 1727204221.87760: running 
the handler 41445 1727204221.87822: _low_level_execute_command(): starting 41445 1727204221.87826: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204221.88687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.88692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.88997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.89015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.89041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.89105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.90835: stdout chunk (state=3): >>>/root <<< 41445 1727204221.90996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.91000: stdout chunk (state=3): >>><<< 41445 1727204221.91002: stderr chunk (state=3): >>><<< 41445 1727204221.91326: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204221.91331: _low_level_execute_command(): starting 41445 1727204221.91335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701 `" && echo ansible-tmp-1727204221.9120328-44093-274889999451701="` echo /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701 `" ) && sleep 0' 41445 1727204221.92153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204221.92162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204221.92177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204221.92190: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204221.92202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204221.92209: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204221.92297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.92305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.92319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.92338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.92408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.94324: stdout chunk (state=3): >>>ansible-tmp-1727204221.9120328-44093-274889999451701=/root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701 <<< 41445 1727204221.94567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204221.94571: stdout chunk (state=3): >>><<< 41445 1727204221.94647: stderr chunk (state=3): >>><<< 41445 1727204221.94800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204221.9120328-44093-274889999451701=/root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204221.94868: variable 'ansible_module_compression' from source: unknown 41445 1727204221.94982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41445 1727204221.94985: variable 'ansible_facts' from source: unknown 41445 1727204221.95231: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py 41445 1727204221.95541: Sending initial data 41445 1727204221.95544: Sent initial data (153 bytes) 41445 1727204221.96971: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204221.97335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204221.97339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204221.97373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204221.97692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204221.99169: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204221.99174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204221.99274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpcgk6ze4i /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py <<< 41445 1727204221.99283: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py" <<< 41445 1727204221.99290: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpcgk6ze4i" to remote "/root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py" <<< 41445 1727204222.00288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.00437: stderr chunk (state=3): >>><<< 41445 1727204222.00441: stdout chunk (state=3): >>><<< 41445 1727204222.00462: done transferring module to remote 41445 1727204222.00472: _low_level_execute_command(): starting 41445 1727204222.00480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/ /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py && sleep 0' 41445 1727204222.01800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204222.01841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204222.01851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204222.01882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204222.01886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 
1727204222.01889: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204222.01919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204222.01926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204222.01929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204222.01937: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204222.01939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204222.02029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204222.02033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204222.02298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.02363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.02393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.04322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.04326: stderr chunk (state=3): >>><<< 41445 1727204222.04329: stdout chunk (state=3): >>><<< 41445 1727204222.04331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204222.04333: _low_level_execute_command(): starting 41445 1727204222.04335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/AnsiballZ_ping.py && sleep 0' 41445 1727204222.05677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204222.05771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204222.05778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.05780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.05828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.20733: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41445 1727204222.22284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204222.22288: stderr chunk (state=3): >>><<< 41445 1727204222.22290: stdout chunk (state=3): >>><<< 41445 1727204222.22324: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204222.22380: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204222.22458: _low_level_execute_command(): starting 41445 1727204222.22461: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204221.9120328-44093-274889999451701/ > /dev/null 2>&1 && sleep 0' 41445 1727204222.23563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204222.23783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204222.23900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.24121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.24255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.26106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.26112: stdout chunk (state=3): >>><<< 41445 1727204222.26122: stderr chunk (state=3): >>><<< 41445 1727204222.26198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204222.26232: handler run complete 41445 1727204222.26247: attempt loop complete, returning result 41445 1727204222.26251: _execute() done 41445 1727204222.26255: dumping result to json 41445 1727204222.26259: done dumping result, returning 41445 1727204222.26270: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-bf02-eee4-0000000000ec] 41445 1727204222.26369: sending task result for task 028d2410-947f-bf02-eee4-0000000000ec 41445 1727204222.26481: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ec 41445 1727204222.26485: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 41445 1727204222.26606: no more pending results, returning what we have 41445 1727204222.26612: results queue empty 41445 1727204222.26613: checking for any_errors_fatal 41445 1727204222.26617: done checking for any_errors_fatal 41445 1727204222.26618: checking for max_fail_percentage 41445 1727204222.26620: done checking for max_fail_percentage 41445 1727204222.26621: checking to see if all hosts have failed and the running result is not ok 41445 1727204222.26622: done checking to see if all hosts have failed 41445 1727204222.26622: getting the remaining hosts for this loop 41445 1727204222.26624: done getting the remaining hosts for this loop 41445 1727204222.26628: getting the next task for host managed-node3 41445 1727204222.26635: done getting next task for host managed-node3 41445 1727204222.26637: ^ task is: TASK: meta (role_complete) 41445 1727204222.26639: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204222.26650: getting variables 41445 1727204222.26652: in VariableManager get_vars() 41445 1727204222.26924: Calling all_inventory to load vars for managed-node3 41445 1727204222.26927: Calling groups_inventory to load vars for managed-node3 41445 1727204222.26930: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204222.26940: Calling all_plugins_play to load vars for managed-node3 41445 1727204222.26943: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204222.26946: Calling groups_plugins_play to load vars for managed-node3 41445 1727204222.28734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204222.31029: done with get_vars() 41445 1727204222.31066: done getting variables 41445 1727204222.31163: done queuing things up, now waiting for results queue to drain 41445 1727204222.31165: results queue empty 41445 1727204222.31166: checking for any_errors_fatal 41445 1727204222.31169: done checking for any_errors_fatal 41445 1727204222.31170: checking for max_fail_percentage 41445 1727204222.31171: done checking for max_fail_percentage 41445 1727204222.31172: checking to see if all hosts have failed and the running result is not ok 41445 1727204222.31172: done checking to see if all hosts have failed 41445 1727204222.31173: getting the remaining hosts for this loop 41445 1727204222.31174: done getting the remaining hosts for this loop 41445 1727204222.31180: getting the next task for host managed-node3 41445 1727204222.31184: done getting next task for host managed-node3 41445 1727204222.31186: ^ task is: TASK: meta (flush_handlers) 41445 1727204222.31187: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41445 1727204222.31190: getting variables 41445 1727204222.31191: in VariableManager get_vars() 41445 1727204222.31205: Calling all_inventory to load vars for managed-node3 41445 1727204222.31207: Calling groups_inventory to load vars for managed-node3 41445 1727204222.31209: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204222.31217: Calling all_plugins_play to load vars for managed-node3 41445 1727204222.31219: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204222.31222: Calling groups_plugins_play to load vars for managed-node3 41445 1727204222.33284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204222.35886: done with get_vars() 41445 1727204222.35960: done getting variables 41445 1727204222.36021: in VariableManager get_vars() 41445 1727204222.36035: Calling all_inventory to load vars for managed-node3 41445 1727204222.36038: Calling groups_inventory to load vars for managed-node3 41445 1727204222.36040: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204222.36045: Calling all_plugins_play to load vars for managed-node3 41445 1727204222.36047: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204222.36050: Calling groups_plugins_play to load vars for managed-node3 41445 1727204222.38183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204222.40760: done with get_vars() 41445 1727204222.40854: done queuing things up, now waiting for results queue to drain 41445 1727204222.40857: results queue empty 41445 1727204222.40858: checking for any_errors_fatal 41445 1727204222.40860: done checking for any_errors_fatal 41445 1727204222.40861: checking for max_fail_percentage 41445 1727204222.40862: done checking for max_fail_percentage 41445 1727204222.40862: checking to see if all hosts have failed and 
the running result is not ok 41445 1727204222.40863: done checking to see if all hosts have failed 41445 1727204222.40864: getting the remaining hosts for this loop 41445 1727204222.40865: done getting the remaining hosts for this loop 41445 1727204222.40868: getting the next task for host managed-node3 41445 1727204222.40873: done getting next task for host managed-node3 41445 1727204222.40874: ^ task is: TASK: meta (flush_handlers) 41445 1727204222.40947: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204222.40957: getting variables 41445 1727204222.40959: in VariableManager get_vars() 41445 1727204222.40974: Calling all_inventory to load vars for managed-node3 41445 1727204222.40979: Calling groups_inventory to load vars for managed-node3 41445 1727204222.40981: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204222.40988: Calling all_plugins_play to load vars for managed-node3 41445 1727204222.40990: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204222.40992: Calling groups_plugins_play to load vars for managed-node3 41445 1727204222.43754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204222.46244: done with get_vars() 41445 1727204222.46283: done getting variables 41445 1727204222.46343: in VariableManager get_vars() 41445 1727204222.46363: Calling all_inventory to load vars for managed-node3 41445 1727204222.46366: Calling groups_inventory to load vars for managed-node3 41445 1727204222.46368: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204222.46374: Calling all_plugins_play to load vars for managed-node3 41445 1727204222.46377: Calling 
groups_plugins_inventory to load vars for managed-node3 41445 1727204222.46380: Calling groups_plugins_play to load vars for managed-node3 41445 1727204222.49622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204222.53164: done with get_vars() 41445 1727204222.53414: done queuing things up, now waiting for results queue to drain 41445 1727204222.53417: results queue empty 41445 1727204222.53418: checking for any_errors_fatal 41445 1727204222.53420: done checking for any_errors_fatal 41445 1727204222.53420: checking for max_fail_percentage 41445 1727204222.53422: done checking for max_fail_percentage 41445 1727204222.53422: checking to see if all hosts have failed and the running result is not ok 41445 1727204222.53423: done checking to see if all hosts have failed 41445 1727204222.53424: getting the remaining hosts for this loop 41445 1727204222.53425: done getting the remaining hosts for this loop 41445 1727204222.53428: getting the next task for host managed-node3 41445 1727204222.53431: done getting next task for host managed-node3 41445 1727204222.53432: ^ task is: None 41445 1727204222.53434: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204222.53435: done queuing things up, now waiting for results queue to drain 41445 1727204222.53436: results queue empty 41445 1727204222.53436: checking for any_errors_fatal 41445 1727204222.53437: done checking for any_errors_fatal 41445 1727204222.53438: checking for max_fail_percentage 41445 1727204222.53438: done checking for max_fail_percentage 41445 1727204222.53439: checking to see if all hosts have failed and the running result is not ok 41445 1727204222.53440: done checking to see if all hosts have failed 41445 1727204222.53441: getting the next task for host managed-node3 41445 1727204222.53443: done getting next task for host managed-node3 41445 1727204222.53444: ^ task is: None 41445 1727204222.53445: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204222.53773: in VariableManager get_vars() 41445 1727204222.53797: done with get_vars() 41445 1727204222.53804: in VariableManager get_vars() 41445 1727204222.53815: done with get_vars() 41445 1727204222.53819: variable 'omit' from source: magic vars 41445 1727204222.53851: in VariableManager get_vars() 41445 1727204222.53861: done with get_vars() 41445 1727204222.54021: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 41445 1727204222.54423: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41445 1727204222.54693: getting the remaining hosts for this loop 41445 1727204222.54695: done getting the remaining hosts for this loop 41445 1727204222.54698: getting the next task for host managed-node3 41445 1727204222.54701: done getting next task for host managed-node3 41445 1727204222.54704: ^ task is: TASK: Gathering Facts 41445 1727204222.54706: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204222.54708: getting variables 41445 1727204222.54709: in VariableManager get_vars() 41445 1727204222.54721: Calling all_inventory to load vars for managed-node3 41445 1727204222.54724: Calling groups_inventory to load vars for managed-node3 41445 1727204222.54726: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204222.54732: Calling all_plugins_play to load vars for managed-node3 41445 1727204222.54734: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204222.54737: Calling groups_plugins_play to load vars for managed-node3 41445 1727204222.57870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204222.61919: done with get_vars() 41445 1727204222.61954: done getting variables 41445 1727204222.62004: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149 Tuesday 24 September 2024 14:57:02 -0400 (0:00:00.775) 0:00:41.408 ***** 41445 1727204222.62034: entering _queue_task() for managed-node3/gather_facts 41445 1727204222.62791: worker is 1 (out of 1 available) 41445 1727204222.62804: exiting _queue_task() for managed-node3/gather_facts 41445 1727204222.62818: done queuing things up, now waiting for results queue to drain 41445 1727204222.62819: waiting for pending results... 
41445 1727204222.63369: running TaskExecutor() for managed-node3/TASK: Gathering Facts 41445 1727204222.63688: in run() - task 028d2410-947f-bf02-eee4-00000000085b 41445 1727204222.63716: variable 'ansible_search_path' from source: unknown 41445 1727204222.63761: calling self._execute() 41445 1727204222.64192: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204222.64197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204222.64200: variable 'omit' from source: magic vars 41445 1727204222.64840: variable 'ansible_distribution_major_version' from source: facts 41445 1727204222.65066: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204222.65070: variable 'omit' from source: magic vars 41445 1727204222.65073: variable 'omit' from source: magic vars 41445 1727204222.65077: variable 'omit' from source: magic vars 41445 1727204222.65206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204222.65251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204222.65303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204222.65480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204222.65484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204222.65486: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204222.65488: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204222.65490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204222.65693: Set connection var ansible_shell_executable to /bin/sh 41445 1727204222.65826: Set 
connection var ansible_shell_type to sh 41445 1727204222.65829: Set connection var ansible_pipelining to False 41445 1727204222.65832: Set connection var ansible_timeout to 10 41445 1727204222.65834: Set connection var ansible_connection to ssh 41445 1727204222.65836: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204222.65864: variable 'ansible_shell_executable' from source: unknown 41445 1727204222.65904: variable 'ansible_connection' from source: unknown 41445 1727204222.65916: variable 'ansible_module_compression' from source: unknown 41445 1727204222.65923: variable 'ansible_shell_type' from source: unknown 41445 1727204222.65935: variable 'ansible_shell_executable' from source: unknown 41445 1727204222.65942: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204222.65949: variable 'ansible_pipelining' from source: unknown 41445 1727204222.65956: variable 'ansible_timeout' from source: unknown 41445 1727204222.65963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204222.66172: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204222.66189: variable 'omit' from source: magic vars 41445 1727204222.66198: starting attempt loop 41445 1727204222.66204: running the handler 41445 1727204222.66226: variable 'ansible_facts' from source: unknown 41445 1727204222.66262: _low_level_execute_command(): starting 41445 1727204222.66266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204222.67085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204222.67114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204222.67208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.67244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.67380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.69061: stdout chunk (state=3): >>>/root <<< 41445 1727204222.69454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.69459: stdout chunk (state=3): >>><<< 41445 1727204222.69461: stderr chunk (state=3): >>><<< 41445 1727204222.69465: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204222.69468: _low_level_execute_command(): starting 41445 1727204222.69470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806 `" && echo ansible-tmp-1727204222.6935039-44233-121135581123806="` echo /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806 `" ) && sleep 0' 41445 1727204222.70614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204222.70666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204222.70771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.71038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.71055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.72931: stdout chunk (state=3): >>>ansible-tmp-1727204222.6935039-44233-121135581123806=/root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806 <<< 41445 1727204222.73088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.73091: stdout chunk (state=3): >>><<< 41445 1727204222.73093: stderr chunk (state=3): >>><<< 41445 1727204222.73116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204222.6935039-44233-121135581123806=/root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204222.73180: variable 'ansible_module_compression' from source: unknown 41445 1727204222.73325: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41445 1727204222.73590: variable 'ansible_facts' from source: unknown 41445 1727204222.73960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py 41445 1727204222.74609: Sending initial data 41445 1727204222.74613: Sent initial data (154 bytes) 41445 1727204222.75702: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
41445 1727204222.75721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.77230: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204222.77262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204222.77303: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpsvz_fova /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py <<< 41445 1727204222.77307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py" <<< 41445 1727204222.77333: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpsvz_fova" to remote "/root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py" <<< 41445 1727204222.79998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.80014: stdout chunk (state=3): >>><<< 41445 1727204222.80029: stderr chunk (state=3): >>><<< 41445 1727204222.80100: done transferring module to remote 41445 1727204222.80126: _low_level_execute_command(): starting 41445 1727204222.80191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/ /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py && sleep 0' 41445 1727204222.81443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204222.81507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204222.81694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204222.81722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204222.81749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.81800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.81953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204222.83742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204222.83754: stdout chunk (state=3): >>><<< 41445 1727204222.83766: stderr chunk (state=3): >>><<< 41445 1727204222.83797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 
originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204222.83973: _low_level_execute_command(): starting 41445 1727204222.83980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/AnsiballZ_setup.py && sleep 0' 41445 1727204222.85156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204222.85380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204222.85500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204222.85573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 
1727204223.48336: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.64697265625, "5m": 0.5439453125, "15m": 0.32177734375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": 
"off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "03", "epoch": "1727204223", "epoch_int": "1727204223", "date": "2024-09-24", "time": "14:57:03", "iso8601_micro": "2024-09-24T18:57:03.172939Z", "iso8601": "2024-09-24T18:57:03Z", "iso8601_basic": "20240924T145703172939", "iso8601_basic_short": "20240924T145703", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2931, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 600, "free": 2931}, "nocache": {"free": 3271, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 800, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788733440, "block_size": 4096, "block_total": 65519099, "block_available": 63913265, "block_used": 1605834, "inode_total": 131070960, "inode_available": 131027341, "inode_used": 43619, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41445 1727204223.50233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204223.50447: stdout chunk (state=3): >>><<< 41445 1727204223.50451: stderr chunk (state=3): >>><<< 41445 1727204223.50455: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec25272c602494034078bc876e25857f", "ansible_local": {}, "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC83yKiaGAYjfqsqlfpPMYFAYI2IZVpX8WgNfvPCdI/QOKjuzs4b6SbV/Bm0ogtp9Et9STTGoHBvp3tMYQ6i0y/9DHMBxtiHYJ+rIzJ/YNNMqKc+fMqexyOsi7rKKuzfxXrYU8uPBHq5WU3WAZmJxJn+UHZGog8hUnJ8momdJG+aYo9El3Qce4gVdwORcmHZUOa49M8lLCwTovtYArmkGETUVJ+Jk8huVTzYpASWxxcw6zOvUcn52HC6dmNQv/T+k2uW6UW0rybwIrVUlZXRNODrXs8kCGgOx1OI0XYB3FndJOnORF4A9Y6onLo/zUCEaO8Pi19mcfSbo2v+bmotTVk5jcmvR3jhVYJmJE6a+dQpjSZolSqMv8mI9tkztfxM6bJlNSZcTrvZEzu7cbiE38Pp/Ku143n9iGgWNmUQ2FhUDpoWEhoA767VUunE48P8ivpVZ/u5aEyupZSLEuWEvCLmmGPVcf9hVbcXw0n8RFvUwDdD8WARVhN5GJFUVN5JM0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm1sMDuVWGACN5pHFDkl6tR93F90YCY4cFGcXcCoQnN+oT963FmBwTMMlfDIm4G2OUATCZuz6QFZP9trAaUzXo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAB71QKijTWbanEvrb0ex0kLr0wX6qyv6naldRWNiIFP", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": 
"UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.64697265625, "5m": 0.5439453125, "15m": 0.32177734375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 53526 10.31.47.22 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 53526 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::83:38ff:fe1a:ae4d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", 
"tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "26:cf:9a:9b:f7:ee", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": 
"off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.22", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:83:38:1a:ae:4d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.22", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::83:38ff:fe1a:ae4d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.22", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::83:38ff:fe1a:ae4d"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "03", "epoch": "1727204223", 
"epoch_int": "1727204223", "date": "2024-09-24", "time": "14:57:03", "iso8601_micro": "2024-09-24T18:57:03.172939Z", "iso8601": "2024-09-24T18:57:03Z", "iso8601_basic": "20240924T145703172939", "iso8601_basic_short": "20240924T145703", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2931, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 600, "free": 2931}, "nocache": {"free": 3271, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_uuid": "ec25272c-6024-9403-4078-bc876e25857f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": 
"Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 800, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788733440, "block_size": 4096, "block_total": 65519099, "block_available": 63913265, "block_used": 1605834, "inode_total": 131070960, "inode_available": 131027341, "inode_used": 43619, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204223.51724: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204223.51750: _low_level_execute_command(): starting 41445 1727204223.51990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204222.6935039-44233-121135581123806/ > /dev/null 2>&1 && sleep 0' 41445 1727204223.53330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 41445 1727204223.53570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204223.53602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204223.53780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204223.55585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204223.55632: stderr chunk (state=3): >>><<< 41445 1727204223.55690: stdout chunk (state=3): >>><<< 41445 1727204223.55715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204223.55796: handler run complete 41445 1727204223.55951: variable 'ansible_facts' from source: unknown 41445 1727204223.56406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204223.57082: variable 'ansible_facts' from source: unknown 41445 1727204223.57293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204223.57617: attempt loop complete, returning result 41445 1727204223.57628: _execute() done 41445 1727204223.57635: dumping result to json 41445 1727204223.57682: done dumping result, returning 41445 1727204223.57770: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [028d2410-947f-bf02-eee4-00000000085b] 41445 1727204223.57785: sending task result for task 028d2410-947f-bf02-eee4-00000000085b ok: [managed-node3] 41445 1727204223.59584: no more pending results, returning what we have 41445 1727204223.59628: done sending task result for task 028d2410-947f-bf02-eee4-00000000085b 41445 1727204223.59631: WORKER PROCESS EXITING 41445 1727204223.59634: results queue empty 41445 1727204223.59635: checking for any_errors_fatal 41445 1727204223.59636: done checking for any_errors_fatal 41445 
1727204223.59637: checking for max_fail_percentage 41445 1727204223.59638: done checking for max_fail_percentage 41445 1727204223.59639: checking to see if all hosts have failed and the running result is not ok 41445 1727204223.59640: done checking to see if all hosts have failed 41445 1727204223.59641: getting the remaining hosts for this loop 41445 1727204223.59642: done getting the remaining hosts for this loop 41445 1727204223.59646: getting the next task for host managed-node3 41445 1727204223.59651: done getting next task for host managed-node3 41445 1727204223.59653: ^ task is: TASK: meta (flush_handlers) 41445 1727204223.59655: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204223.59659: getting variables 41445 1727204223.59660: in VariableManager get_vars() 41445 1727204223.59799: Calling all_inventory to load vars for managed-node3 41445 1727204223.59803: Calling groups_inventory to load vars for managed-node3 41445 1727204223.59807: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204223.59818: Calling all_plugins_play to load vars for managed-node3 41445 1727204223.59821: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204223.59824: Calling groups_plugins_play to load vars for managed-node3 41445 1727204223.74041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204223.77632: done with get_vars() 41445 1727204223.77665: done getting variables 41445 1727204223.77850: in VariableManager get_vars() 41445 1727204223.77860: Calling all_inventory to load vars for managed-node3 41445 1727204223.77863: Calling groups_inventory to load vars for managed-node3 41445 1727204223.77865: Calling 
all_plugins_inventory to load vars for managed-node3 41445 1727204223.77870: Calling all_plugins_play to load vars for managed-node3 41445 1727204223.77872: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204223.77922: Calling groups_plugins_play to load vars for managed-node3 41445 1727204223.80407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204223.82899: done with get_vars() 41445 1727204223.82941: done queuing things up, now waiting for results queue to drain 41445 1727204223.82944: results queue empty 41445 1727204223.82945: checking for any_errors_fatal 41445 1727204223.82949: done checking for any_errors_fatal 41445 1727204223.82950: checking for max_fail_percentage 41445 1727204223.82951: done checking for max_fail_percentage 41445 1727204223.82952: checking to see if all hosts have failed and the running result is not ok 41445 1727204223.82953: done checking to see if all hosts have failed 41445 1727204223.82959: getting the remaining hosts for this loop 41445 1727204223.82960: done getting the remaining hosts for this loop 41445 1727204223.82963: getting the next task for host managed-node3 41445 1727204223.82967: done getting next task for host managed-node3 41445 1727204223.82970: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 41445 1727204223.82971: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204223.82973: getting variables 41445 1727204223.82974: in VariableManager get_vars() 41445 1727204223.82986: Calling all_inventory to load vars for managed-node3 41445 1727204223.82989: Calling groups_inventory to load vars for managed-node3 41445 1727204223.82991: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204223.82997: Calling all_plugins_play to load vars for managed-node3 41445 1727204223.83000: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204223.83002: Calling groups_plugins_play to load vars for managed-node3 41445 1727204223.84315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204223.85998: done with get_vars() 41445 1727204223.86025: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:152 Tuesday 24 September 2024 14:57:03 -0400 (0:00:01.240) 0:00:42.648 ***** 41445 1727204223.86109: entering _queue_task() for managed-node3/include_tasks 41445 1727204223.86590: worker is 1 (out of 1 available) 41445 1727204223.86602: exiting _queue_task() for managed-node3/include_tasks 41445 1727204223.86613: done queuing things up, now waiting for results queue to drain 41445 1727204223.86614: waiting for pending results... 
41445 1727204223.86818: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' 41445 1727204223.86956: in run() - task 028d2410-947f-bf02-eee4-0000000000ef 41445 1727204223.87059: variable 'ansible_search_path' from source: unknown 41445 1727204223.87062: calling self._execute() 41445 1727204223.87129: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204223.87139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204223.87152: variable 'omit' from source: magic vars 41445 1727204223.87567: variable 'ansible_distribution_major_version' from source: facts 41445 1727204223.87591: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204223.87612: _execute() done 41445 1727204223.87621: dumping result to json 41445 1727204223.87708: done dumping result, returning 41445 1727204223.87712: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_absent.yml' [028d2410-947f-bf02-eee4-0000000000ef] 41445 1727204223.87714: sending task result for task 028d2410-947f-bf02-eee4-0000000000ef 41445 1727204223.87881: done sending task result for task 028d2410-947f-bf02-eee4-0000000000ef 41445 1727204223.87885: WORKER PROCESS EXITING 41445 1727204223.87922: no more pending results, returning what we have 41445 1727204223.87928: in VariableManager get_vars() 41445 1727204223.87965: Calling all_inventory to load vars for managed-node3 41445 1727204223.87969: Calling groups_inventory to load vars for managed-node3 41445 1727204223.87972: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204223.88119: Calling all_plugins_play to load vars for managed-node3 41445 1727204223.88124: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204223.88128: Calling groups_plugins_play to load vars for managed-node3 41445 1727204223.89690: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204223.92801: done with get_vars() 41445 1727204223.92954: variable 'ansible_search_path' from source: unknown 41445 1727204223.93004: we have included files to process 41445 1727204223.93006: generating all_blocks data 41445 1727204223.93007: done generating all_blocks data 41445 1727204223.93008: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41445 1727204223.93009: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41445 1727204223.93012: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41445 1727204223.93640: in VariableManager get_vars() 41445 1727204223.93661: done with get_vars() 41445 1727204223.94195: done processing included file 41445 1727204223.94197: iterating over new_blocks loaded from include file 41445 1727204223.94199: in VariableManager get_vars() 41445 1727204223.94215: done with get_vars() 41445 1727204223.94217: filtering new block on tags 41445 1727204223.94235: done filtering new block on tags 41445 1727204223.94238: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node3 41445 1727204223.94244: extending task lists for all hosts with included blocks 41445 1727204223.94442: done extending task lists 41445 1727204223.94444: done processing included files 41445 1727204223.94445: results queue empty 41445 1727204223.94446: checking for any_errors_fatal 41445 1727204223.94447: done checking for any_errors_fatal 41445 1727204223.94448: checking for max_fail_percentage 41445 1727204223.94449: done 
checking for max_fail_percentage 41445 1727204223.94450: checking to see if all hosts have failed and the running result is not ok 41445 1727204223.94451: done checking to see if all hosts have failed 41445 1727204223.94451: getting the remaining hosts for this loop 41445 1727204223.94452: done getting the remaining hosts for this loop 41445 1727204223.94455: getting the next task for host managed-node3 41445 1727204223.94459: done getting next task for host managed-node3 41445 1727204223.94462: ^ task is: TASK: Include the task 'get_profile_stat.yml' 41445 1727204223.94464: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204223.94467: getting variables 41445 1727204223.94468: in VariableManager get_vars() 41445 1727204223.94549: Calling all_inventory to load vars for managed-node3 41445 1727204223.94553: Calling groups_inventory to load vars for managed-node3 41445 1727204223.94555: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204223.94562: Calling all_plugins_play to load vars for managed-node3 41445 1727204223.94564: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204223.94567: Calling groups_plugins_play to load vars for managed-node3 41445 1727204223.98744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204224.03031: done with get_vars() 41445 1727204224.03066: done getting variables
TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3
Tuesday 24 September 2024 14:57:04 -0400 (0:00:00.171) 0:00:42.820 *****
41445 1727204224.03257: entering _queue_task() for managed-node3/include_tasks 41445 1727204224.04093: worker is 1 (out of 1 available) 41445 1727204224.04107: exiting _queue_task() for managed-node3/include_tasks 41445 1727204224.04118: done queuing things up, now waiting for results queue to drain 41445 1727204224.04119: waiting for pending results... 
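Each debug entry in this log follows the fixed pattern `<pid> <epoch.micros>: <message>` (here the PID is 41445 throughout). As a minimal sketch, a hypothetical helper (not part of Ansible) that splits one such entry into its fields:

```python
import re

# One verbose-log entry looks like: "41445 1727204224.03257: entering _queue_task() ..."
# Named groups capture the worker PID, the epoch timestamp with microseconds,
# and the free-form message that follows the colon.
ENTRY = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def parse_entry(line):
    """Return (pid, timestamp, message) for one log entry, or None if it doesn't match."""
    m = ENTRY.match(line)
    if m is None:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

pid, ts, msg = parse_entry(
    "41445 1727204224.03257: entering _queue_task() for managed-node3/include_tasks"
)
print(pid, msg)
```

Entries wrapped across physical lines in this dump would first need to be re-joined on the `<pid> <epoch>:` boundary before parsing.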
41445 1727204224.04655: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 41445 1727204224.04981: in run() - task 028d2410-947f-bf02-eee4-00000000086c 41445 1727204224.05004: variable 'ansible_search_path' from source: unknown 41445 1727204224.05042: variable 'ansible_search_path' from source: unknown 41445 1727204224.05173: calling self._execute() 41445 1727204224.05409: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.05423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.05682: variable 'omit' from source: magic vars 41445 1727204224.06408: variable 'ansible_distribution_major_version' from source: facts 41445 1727204224.06472: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204224.06517: _execute() done 41445 1727204224.06529: dumping result to json 41445 1727204224.06541: done dumping result, returning 41445 1727204224.06656: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-bf02-eee4-00000000086c] 41445 1727204224.06659: sending task result for task 028d2410-947f-bf02-eee4-00000000086c 41445 1727204224.06811: no more pending results, returning what we have 41445 1727204224.06818: in VariableManager get_vars() 41445 1727204224.06863: Calling all_inventory to load vars for managed-node3 41445 1727204224.06866: Calling groups_inventory to load vars for managed-node3 41445 1727204224.06870: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204224.06971: Calling all_plugins_play to load vars for managed-node3 41445 1727204224.06982: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204224.06987: Calling groups_plugins_play to load vars for managed-node3 41445 1727204224.07619: done sending task result for task 028d2410-947f-bf02-eee4-00000000086c 41445 1727204224.07623: WORKER PROCESS EXITING 41445 
1727204224.08964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204224.12223: done with get_vars() 41445 1727204224.12254: variable 'ansible_search_path' from source: unknown 41445 1727204224.12256: variable 'ansible_search_path' from source: unknown 41445 1727204224.12305: we have included files to process 41445 1727204224.12307: generating all_blocks data 41445 1727204224.12309: done generating all_blocks data 41445 1727204224.12310: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41445 1727204224.12311: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41445 1727204224.12313: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41445 1727204224.13733: done processing included file 41445 1727204224.13737: iterating over new_blocks loaded from include file 41445 1727204224.13738: in VariableManager get_vars() 41445 1727204224.13757: done with get_vars() 41445 1727204224.13759: filtering new block on tags 41445 1727204224.13786: done filtering new block on tags 41445 1727204224.13790: in VariableManager get_vars() 41445 1727204224.13804: done with get_vars() 41445 1727204224.13805: filtering new block on tags 41445 1727204224.13827: done filtering new block on tags 41445 1727204224.13829: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 41445 1727204224.13835: extending task lists for all hosts with included blocks 41445 1727204224.14262: done extending task lists 41445 1727204224.14264: done processing included files 41445 1727204224.14265: results queue empty 41445 
1727204224.14265: checking for any_errors_fatal 41445 1727204224.14269: done checking for any_errors_fatal 41445 1727204224.14270: checking for max_fail_percentage 41445 1727204224.14271: done checking for max_fail_percentage 41445 1727204224.14272: checking to see if all hosts have failed and the running result is not ok 41445 1727204224.14272: done checking to see if all hosts have failed 41445 1727204224.14273: getting the remaining hosts for this loop 41445 1727204224.14274: done getting the remaining hosts for this loop 41445 1727204224.14279: getting the next task for host managed-node3 41445 1727204224.14283: done getting next task for host managed-node3 41445 1727204224.14285: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 41445 1727204224.14288: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204224.14291: getting variables 41445 1727204224.14292: in VariableManager get_vars() 41445 1727204224.14600: Calling all_inventory to load vars for managed-node3 41445 1727204224.14603: Calling groups_inventory to load vars for managed-node3 41445 1727204224.14606: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204224.14612: Calling all_plugins_play to load vars for managed-node3 41445 1727204224.14615: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204224.14618: Calling groups_plugins_play to load vars for managed-node3 41445 1727204224.17623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204224.21724: done with get_vars() 41445 1727204224.21850: done getting variables 41445 1727204224.22048: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 14:57:04 -0400 (0:00:00.188) 0:00:43.008 *****
41445 1727204224.22086: entering _queue_task() for managed-node3/set_fact 41445 1727204224.23093: worker is 1 (out of 1 available) 41445 1727204224.23106: exiting _queue_task() for managed-node3/set_fact 41445 1727204224.23154: done queuing things up, now waiting for results queue to drain 41445 1727204224.23156: waiting for pending results... 
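The `^ state is: HOST STATE: ...` dumps above are Ansible's per-host task cursor being printed, with nested `tasks child state?` entries for each level of include. The sketch below is a simplified, hypothetical mirror of the fields shown in those dumps, for readability only; it is not Ansible's real `HostState` class (the real one lives in `ansible.executor.play_iterator` and carries more machinery):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified mirror of the fields printed in the "HOST STATE" dumps:
# block/task are indices into the play's block list and the block's task list;
# run_state=1 and fail_state=0 are the values shown above while tasks are iterating
# without failures; a non-None tasks_child_state means an include is being walked.
@dataclass
class HostStateSketch:
    block: int = 0
    task: int = 0
    run_state: int = 1
    fail_state: int = 0
    did_rescue: bool = False
    did_start_at_task: bool = False
    tasks_child_state: Optional["HostStateSketch"] = None

# The outer cursor printed for the include task above: block=2, task=2, with a
# child cursor (block=0, task=1) pointing into the included file's task list.
outer = HostStateSketch(block=2, task=2,
                        tasks_child_state=HostStateSketch(block=0, task=1))
print(outer.block, outer.tasks_child_state.task)
```

The deeper nesting in the later dumps (a child state inside a child state) corresponds to `get_profile_stat.yml` being included from `assert_profile_absent.yml`, which is itself included from the play.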
41445 1727204224.23409: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 41445 1727204224.23626: in run() - task 028d2410-947f-bf02-eee4-00000000087b 41445 1727204224.23631: variable 'ansible_search_path' from source: unknown 41445 1727204224.23634: variable 'ansible_search_path' from source: unknown 41445 1727204224.23641: calling self._execute() 41445 1727204224.23746: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.23750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.23753: variable 'omit' from source: magic vars 41445 1727204224.24287: variable 'ansible_distribution_major_version' from source: facts 41445 1727204224.24293: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204224.24296: variable 'omit' from source: magic vars 41445 1727204224.24538: variable 'omit' from source: magic vars 41445 1727204224.24585: variable 'omit' from source: magic vars 41445 1727204224.24624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204224.24661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204224.24682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204224.24700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204224.24716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204224.24744: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204224.24748: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.24751: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 41445 1727204224.25059: Set connection var ansible_shell_executable to /bin/sh 41445 1727204224.25063: Set connection var ansible_shell_type to sh 41445 1727204224.25065: Set connection var ansible_pipelining to False 41445 1727204224.25076: Set connection var ansible_timeout to 10 41445 1727204224.25079: Set connection var ansible_connection to ssh 41445 1727204224.25196: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204224.25216: variable 'ansible_shell_executable' from source: unknown 41445 1727204224.25219: variable 'ansible_connection' from source: unknown 41445 1727204224.25223: variable 'ansible_module_compression' from source: unknown 41445 1727204224.25226: variable 'ansible_shell_type' from source: unknown 41445 1727204224.25228: variable 'ansible_shell_executable' from source: unknown 41445 1727204224.25231: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.25233: variable 'ansible_pipelining' from source: unknown 41445 1727204224.25235: variable 'ansible_timeout' from source: unknown 41445 1727204224.25237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.25591: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204224.25603: variable 'omit' from source: magic vars 41445 1727204224.25608: starting attempt loop 41445 1727204224.25613: running the handler 41445 1727204224.25624: handler run complete 41445 1727204224.25637: attempt loop complete, returning result 41445 1727204224.25752: _execute() done 41445 1727204224.25755: dumping result to json 41445 1727204224.25758: done dumping result, returning 41445 1727204224.25766: done running TaskExecutor() for 
managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-bf02-eee4-00000000087b] 41445 1727204224.25773: sending task result for task 028d2410-947f-bf02-eee4-00000000087b ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 41445 1727204224.26064: no more pending results, returning what we have 41445 1727204224.26075: results queue empty 41445 1727204224.26078: checking for any_errors_fatal 41445 1727204224.26080: done checking for any_errors_fatal 41445 1727204224.26081: checking for max_fail_percentage 41445 1727204224.26083: done checking for max_fail_percentage 41445 1727204224.26083: checking to see if all hosts have failed and the running result is not ok 41445 1727204224.26084: done checking to see if all hosts have failed 41445 1727204224.26085: getting the remaining hosts for this loop 41445 1727204224.26086: done getting the remaining hosts for this loop 41445 1727204224.26090: getting the next task for host managed-node3 41445 1727204224.26099: done getting next task for host managed-node3 41445 1727204224.26102: ^ task is: TASK: Stat profile file 41445 1727204224.26105: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204224.26111: getting variables 41445 1727204224.26113: in VariableManager get_vars() 41445 1727204224.26144: Calling all_inventory to load vars for managed-node3 41445 1727204224.26147: Calling groups_inventory to load vars for managed-node3 41445 1727204224.26151: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204224.26165: Calling all_plugins_play to load vars for managed-node3 41445 1727204224.26169: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204224.26173: Calling groups_plugins_play to load vars for managed-node3 41445 1727204224.26857: done sending task result for task 028d2410-947f-bf02-eee4-00000000087b 41445 1727204224.26861: WORKER PROCESS EXITING 41445 1727204224.30394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204224.34052: done with get_vars() 41445 1727204224.34107: done getting variables
TASK [Stat profile file] *******************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024 14:57:04 -0400 (0:00:00.121) 0:00:43.129 *****
41445 1727204224.34204: entering _queue_task() for managed-node3/stat 41445 1727204224.35255: worker is 1 (out of 1 available) 41445 1727204224.35268: exiting _queue_task() for managed-node3/stat 41445 1727204224.35334: done queuing things up, now waiting for results queue to drain 41445 1727204224.35336: waiting for pending results... 
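The `ok: [managed-node3] => {...}` payload printed for the set_fact task above is plain JSON. A small illustration, using that exact payload from the log, of recovering the facts it registered:

```python
import json

# The exact result payload printed for the "Initialize NM profile exist and
# ansible_managed comment flag" set_fact task above.
result = json.loads("""
{
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
""")

# set_fact registers everything under "ansible_facts" as host variables;
# "changed": false reflects that setting facts never mutates the managed host.
facts = result["ansible_facts"]
print(sorted(facts), result["changed"])
```

These three flags are then asserted against by the later tasks in `get_profile_stat.yml` once the `Stat profile file` result comes back.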
41445 1727204224.36068: running TaskExecutor() for managed-node3/TASK: Stat profile file 41445 1727204224.36530: in run() - task 028d2410-947f-bf02-eee4-00000000087c 41445 1727204224.36533: variable 'ansible_search_path' from source: unknown 41445 1727204224.36537: variable 'ansible_search_path' from source: unknown 41445 1727204224.36714: calling self._execute() 41445 1727204224.36783: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.36794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.36821: variable 'omit' from source: magic vars 41445 1727204224.37171: variable 'ansible_distribution_major_version' from source: facts 41445 1727204224.37185: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204224.37189: variable 'omit' from source: magic vars 41445 1727204224.37223: variable 'omit' from source: magic vars 41445 1727204224.37294: variable 'profile' from source: include params 41445 1727204224.37300: variable 'interface' from source: set_fact 41445 1727204224.37347: variable 'interface' from source: set_fact 41445 1727204224.37366: variable 'omit' from source: magic vars 41445 1727204224.37398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204224.37427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204224.37442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204224.37455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204224.37468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204224.37494: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 
1727204224.37497: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.37499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.37569: Set connection var ansible_shell_executable to /bin/sh 41445 1727204224.37572: Set connection var ansible_shell_type to sh 41445 1727204224.37575: Set connection var ansible_pipelining to False 41445 1727204224.37586: Set connection var ansible_timeout to 10 41445 1727204224.37588: Set connection var ansible_connection to ssh 41445 1727204224.37595: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204224.37616: variable 'ansible_shell_executable' from source: unknown 41445 1727204224.37619: variable 'ansible_connection' from source: unknown 41445 1727204224.37621: variable 'ansible_module_compression' from source: unknown 41445 1727204224.37623: variable 'ansible_shell_type' from source: unknown 41445 1727204224.37626: variable 'ansible_shell_executable' from source: unknown 41445 1727204224.37629: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.37632: variable 'ansible_pipelining' from source: unknown 41445 1727204224.37634: variable 'ansible_timeout' from source: unknown 41445 1727204224.37636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.37793: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204224.37804: variable 'omit' from source: magic vars 41445 1727204224.37809: starting attempt loop 41445 1727204224.37814: running the handler 41445 1727204224.37823: _low_level_execute_command(): starting 41445 1727204224.37830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204224.38328: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.38332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204224.38336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.38389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.38392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.38439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.40117: stdout chunk (state=3): >>>/root <<< 41445 1727204224.40222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.40245: stderr chunk (state=3): >>><<< 41445 1727204224.40249: stdout chunk (state=3): >>><<< 41445 1727204224.40268: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.40280: _low_level_execute_command(): starting 41445 1727204224.40287: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854 `" && echo ansible-tmp-1727204224.4026706-44286-256946276790854="` echo /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854 `" ) && sleep 0' 41445 1727204224.40783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.40789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.40802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204224.40805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.40844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.40850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.40894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.40930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.42792: stdout chunk (state=3): >>>ansible-tmp-1727204224.4026706-44286-256946276790854=/root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854 <<< 41445 1727204224.42916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.42933: stderr chunk (state=3): >>><<< 41445 1727204224.42936: stdout chunk (state=3): >>><<< 41445 1727204224.42986: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204224.4026706-44286-256946276790854=/root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.43005: variable 'ansible_module_compression' from source: unknown 41445 1727204224.43071: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41445 1727204224.43100: variable 'ansible_facts' from source: unknown 41445 1727204224.43161: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py 41445 1727204224.43266: Sending initial data 41445 1727204224.43270: Sent initial data (153 bytes) 41445 1727204224.43711: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204224.43714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204224.43743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.43746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.43748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.43800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.43858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.43907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.45425: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41445 1727204224.45448: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204224.45509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204224.45600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp8f3u_zfp /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py <<< 41445 1727204224.45604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py" <<< 41445 1727204224.45637: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp8f3u_zfp" to remote "/root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py" <<< 41445 1727204224.46427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.46464: stderr chunk (state=3): >>><<< 41445 1727204224.46468: stdout chunk (state=3): >>><<< 41445 1727204224.46490: done transferring module to remote 41445 1727204224.46499: _low_level_execute_command(): starting 41445 1727204224.46504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/ /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py && sleep 0' 41445 1727204224.46957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.46962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204224.46964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 
1727204224.46967: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.46972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.47022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.47026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.47063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.48757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.48782: stderr chunk (state=3): >>><<< 41445 1727204224.48785: stdout chunk (state=3): >>><<< 41445 1727204224.48798: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.48800: _low_level_execute_command(): starting 41445 1727204224.48805: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/AnsiballZ_stat.py && sleep 0' 41445 1727204224.49251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.49254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.49285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204224.49288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.49348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: 
fd 3 setting O_NONBLOCK <<< 41445 1727204224.49350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.49394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.64162: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41445 1727204224.65415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. <<< 41445 1727204224.65444: stderr chunk (state=3): >>><<< 41445 1727204224.65447: stdout chunk (state=3): >>><<< 41445 1727204224.65465: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204224.65492: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204224.65500: _low_level_execute_command(): starting 41445 1727204224.65505: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204224.4026706-44286-256946276790854/ > /dev/null 2>&1 && sleep 0' 41445 1727204224.65940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204224.65973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204224.65983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204224.65986: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.65988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204224.65990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204224.65999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.66038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.66042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.66048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.66085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.67859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.67883: stderr chunk (state=3): >>><<< 41445 1727204224.67887: stdout chunk (state=3): >>><<< 41445 1727204224.67900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.67911: handler run complete 41445 1727204224.67929: attempt loop complete, returning result 41445 1727204224.67932: _execute() done 41445 1727204224.67934: dumping result to json 41445 1727204224.67938: done dumping result, returning 41445 1727204224.67947: done running TaskExecutor() for managed-node3/TASK: Stat profile file [028d2410-947f-bf02-eee4-00000000087c] 41445 1727204224.67952: sending task result for task 028d2410-947f-bf02-eee4-00000000087c 41445 1727204224.68045: done sending task result for task 028d2410-947f-bf02-eee4-00000000087c 41445 1727204224.68047: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 41445 1727204224.68105: no more pending results, returning what we have 41445 1727204224.68109: results queue empty 41445 1727204224.68109: checking for any_errors_fatal 41445 1727204224.68117: done checking for any_errors_fatal 41445 1727204224.68118: checking for max_fail_percentage 41445 1727204224.68119: done checking for max_fail_percentage 41445 1727204224.68120: checking to see if all hosts have failed and the running result is not ok 41445 1727204224.68121: done checking to see if all hosts have failed 41445 1727204224.68121: getting the remaining hosts for this loop 41445 1727204224.68122: done getting the remaining hosts for this loop 41445 
1727204224.68126: getting the next task for host managed-node3 41445 1727204224.68133: done getting next task for host managed-node3 41445 1727204224.68135: ^ task is: TASK: Set NM profile exist flag based on the profile files 41445 1727204224.68138: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204224.68142: getting variables 41445 1727204224.68144: in VariableManager get_vars() 41445 1727204224.68176: Calling all_inventory to load vars for managed-node3 41445 1727204224.68187: Calling groups_inventory to load vars for managed-node3 41445 1727204224.68191: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204224.68201: Calling all_plugins_play to load vars for managed-node3 41445 1727204224.68204: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204224.68207: Calling groups_plugins_play to load vars for managed-node3 41445 1727204224.69059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204224.70080: done with get_vars() 41445 1727204224.70096: done getting variables 41445 1727204224.70144: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:57:04 -0400 (0:00:00.359) 0:00:43.489 ***** 41445 1727204224.70166: entering _queue_task() for managed-node3/set_fact 41445 1727204224.70404: worker is 1 (out of 1 available) 41445 1727204224.70419: exiting _queue_task() for managed-node3/set_fact 41445 1727204224.70429: done queuing things up, now waiting for results queue to drain 41445 1727204224.70431: waiting for pending results... 
41445 1727204224.70606: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 41445 1727204224.70681: in run() - task 028d2410-947f-bf02-eee4-00000000087d 41445 1727204224.70693: variable 'ansible_search_path' from source: unknown 41445 1727204224.70696: variable 'ansible_search_path' from source: unknown 41445 1727204224.70724: calling self._execute() 41445 1727204224.70807: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.70813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.70820: variable 'omit' from source: magic vars 41445 1727204224.71106: variable 'ansible_distribution_major_version' from source: facts 41445 1727204224.71122: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204224.71201: variable 'profile_stat' from source: set_fact 41445 1727204224.71215: Evaluated conditional (profile_stat.stat.exists): False 41445 1727204224.71220: when evaluation is False, skipping this task 41445 1727204224.71225: _execute() done 41445 1727204224.71228: dumping result to json 41445 1727204224.71230: done dumping result, returning 41445 1727204224.71233: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-bf02-eee4-00000000087d] 41445 1727204224.71236: sending task result for task 028d2410-947f-bf02-eee4-00000000087d 41445 1727204224.71321: done sending task result for task 028d2410-947f-bf02-eee4-00000000087d 41445 1727204224.71324: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41445 1727204224.71381: no more pending results, returning what we have 41445 1727204224.71385: results queue empty 41445 1727204224.71387: checking for any_errors_fatal 41445 1727204224.71393: done checking for any_errors_fatal 41445 1727204224.71394: 
checking for max_fail_percentage 41445 1727204224.71396: done checking for max_fail_percentage 41445 1727204224.71396: checking to see if all hosts have failed and the running result is not ok 41445 1727204224.71397: done checking to see if all hosts have failed 41445 1727204224.71398: getting the remaining hosts for this loop 41445 1727204224.71399: done getting the remaining hosts for this loop 41445 1727204224.71402: getting the next task for host managed-node3 41445 1727204224.71408: done getting next task for host managed-node3 41445 1727204224.71413: ^ task is: TASK: Get NM profile info 41445 1727204224.71416: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204224.71418: getting variables 41445 1727204224.71420: in VariableManager get_vars() 41445 1727204224.71444: Calling all_inventory to load vars for managed-node3 41445 1727204224.71446: Calling groups_inventory to load vars for managed-node3 41445 1727204224.71449: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204224.71461: Calling all_plugins_play to load vars for managed-node3 41445 1727204224.71464: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204224.71467: Calling groups_plugins_play to load vars for managed-node3 41445 1727204224.72249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204224.73140: done with get_vars() 41445 1727204224.73155: done getting variables 41445 1727204224.73225: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:57:04 -0400 (0:00:00.030) 0:00:43.520 ***** 41445 1727204224.73246: entering _queue_task() for managed-node3/shell 41445 1727204224.73247: Creating lock for shell 41445 1727204224.73468: worker is 1 (out of 1 available) 41445 1727204224.73483: exiting _queue_task() for managed-node3/shell 41445 1727204224.73494: done queuing things up, now waiting for results queue to drain 41445 1727204224.73495: waiting for pending results... 
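Each module execution in this trace begins with a remote temp directory such as `ansible-tmp-1727204224.4026706-44286-256946276790854`, created via the quoted `mkdir -p` one-liner before `AnsiballZ_*.py` is uploaded, chmod'ed, run, and removed. The name appears to be `ansible-tmp-<epoch float>-<pid>-<random int>`; the helper below is a reconstruction of that naming scheme inferred from the log, not Ansible's actual implementation:

```python
import random
import time

def remote_tmp_name(pid, now=None, rng=None):
    """Rebuild the ansible-tmp-<time>-<pid>-<random> directory name
    pattern observed in this trace. Assumed format, inferred from the
    logged mkdir commands rather than from Ansible source."""
    now = time.time() if now is None else now
    rng = random.randint(0, 2**48) if rng is None else rng
    return "ansible-tmp-%s-%s-%s" % (now, pid, rng)

# Reproducing the exact directory name from the stat-module run above:
print(remote_tmp_name(44286, now=1727204224.4026706, rng=256946276790854))
```

The per-task randomness is what lets concurrent forks against the same host create non-colliding work directories under `~/.ansible/tmp`, and the trailing `rm -f -r ... && sleep 0` seen later in the trace is the matching cleanup step.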
41445 1727204224.73668: running TaskExecutor() for managed-node3/TASK: Get NM profile info 41445 1727204224.73750: in run() - task 028d2410-947f-bf02-eee4-00000000087e 41445 1727204224.73763: variable 'ansible_search_path' from source: unknown 41445 1727204224.73766: variable 'ansible_search_path' from source: unknown 41445 1727204224.73794: calling self._execute() 41445 1727204224.73871: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.73874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.73885: variable 'omit' from source: magic vars 41445 1727204224.74163: variable 'ansible_distribution_major_version' from source: facts 41445 1727204224.74171: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204224.74178: variable 'omit' from source: magic vars 41445 1727204224.74209: variable 'omit' from source: magic vars 41445 1727204224.74280: variable 'profile' from source: include params 41445 1727204224.74284: variable 'interface' from source: set_fact 41445 1727204224.74337: variable 'interface' from source: set_fact 41445 1727204224.74352: variable 'omit' from source: magic vars 41445 1727204224.74390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204224.74419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204224.74434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204224.74446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204224.74457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204224.74482: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 
1727204224.74487: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.74491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.74558: Set connection var ansible_shell_executable to /bin/sh 41445 1727204224.74562: Set connection var ansible_shell_type to sh 41445 1727204224.74565: Set connection var ansible_pipelining to False 41445 1727204224.74572: Set connection var ansible_timeout to 10 41445 1727204224.74574: Set connection var ansible_connection to ssh 41445 1727204224.74583: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204224.74609: variable 'ansible_shell_executable' from source: unknown 41445 1727204224.74612: variable 'ansible_connection' from source: unknown 41445 1727204224.74614: variable 'ansible_module_compression' from source: unknown 41445 1727204224.74617: variable 'ansible_shell_type' from source: unknown 41445 1727204224.74619: variable 'ansible_shell_executable' from source: unknown 41445 1727204224.74621: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204224.74623: variable 'ansible_pipelining' from source: unknown 41445 1727204224.74626: variable 'ansible_timeout' from source: unknown 41445 1727204224.74627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204224.74731: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204224.74741: variable 'omit' from source: magic vars 41445 1727204224.74743: starting attempt loop 41445 1727204224.74746: running the handler 41445 1727204224.74755: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204224.74770: _low_level_execute_command(): starting 41445 1727204224.74778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204224.75273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.75304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.75312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.75362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.75365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.75367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.75414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.76985: stdout chunk (state=3): >>>/root <<< 41445 1727204224.77083: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 41445 1727204224.77120: stderr chunk (state=3): >>><<< 41445 1727204224.77123: stdout chunk (state=3): >>><<< 41445 1727204224.77139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.77150: _low_level_execute_command(): starting 41445 1727204224.77157: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917 `" && echo ansible-tmp-1727204224.7713897-44306-269032163966917="` echo /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917 `" ) && sleep 0' 41445 1727204224.77580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204224.77592: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204224.77621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.77624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204224.77627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.77671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.77678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.77688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.77728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.79551: stdout chunk (state=3): >>>ansible-tmp-1727204224.7713897-44306-269032163966917=/root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917 <<< 41445 1727204224.79659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.79681: stderr chunk (state=3): >>><<< 41445 1727204224.79685: stdout chunk (state=3): >>><<< 41445 1727204224.79718: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204224.7713897-44306-269032163966917=/root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.79738: variable 'ansible_module_compression' from source: unknown 41445 1727204224.79777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204224.79811: variable 'ansible_facts' from source: unknown 41445 1727204224.79867: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py 41445 1727204224.79965: Sending initial data 41445 1727204224.79968: Sent initial data (156 bytes) 41445 1727204224.80364: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.80397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204224.80400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.80402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.80447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204224.80451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.80498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.81989: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204224.81992: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204224.82027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204224.82068: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmppnkkb62t /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py <<< 41445 1727204224.82097: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py" <<< 41445 1727204224.82131: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmppnkkb62t" to remote "/root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py" <<< 41445 1727204224.82796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.82798: stderr chunk (state=3): >>><<< 41445 1727204224.82800: stdout chunk (state=3): >>><<< 41445 1727204224.82817: done transferring module to remote 41445 1727204224.82829: _low_level_execute_command(): starting 41445 1727204224.82832: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/ /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py && sleep 0' 41445 1727204224.83430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204224.83448: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204224.83484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204224.83492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204224.83548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.83552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.83602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204224.85303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204224.85324: stderr chunk (state=3): >>><<< 41445 1727204224.85327: stdout chunk (state=3): >>><<< 41445 1727204224.85342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204224.85346: _low_level_execute_command(): starting 41445 1727204224.85348: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/AnsiballZ_command.py && sleep 0' 41445 1727204224.85994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204224.86052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204224.86088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204225.02511: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:57:05.007572", "end": "2024-09-24 14:57:05.023975", "delta": "0:00:00.016403", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204225.04274: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204225.04280: stdout chunk (state=3): >>><<< 41445 1727204225.04283: stderr chunk (state=3): >>><<< 41445 1727204225.04285: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:57:05.007572", "end": "2024-09-24 14:57:05.023975", "delta": "0:00:00.016403", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.22 
closed. 41445 1727204225.04289: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204225.04291: _low_level_execute_command(): starting 41445 1727204225.04294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204224.7713897-44306-269032163966917/ > /dev/null 2>&1 && sleep 0' 41445 1727204225.05458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204225.05469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204225.05496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204225.05522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204225.05526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204225.05633: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204225.05703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204225.05707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204225.05803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204225.07545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204225.07577: stderr chunk (state=3): >>><<< 41445 1727204225.07580: stdout chunk (state=3): >>><<< 41445 1727204225.07621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204225.07780: handler run complete 41445 1727204225.07784: Evaluated conditional (False): False 41445 1727204225.07786: attempt loop complete, returning result 41445 1727204225.07788: _execute() done 41445 1727204225.07790: dumping result to json 41445 1727204225.07792: done dumping result, returning 41445 1727204225.07794: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [028d2410-947f-bf02-eee4-00000000087e] 41445 1727204225.07796: sending task result for task 028d2410-947f-bf02-eee4-00000000087e 41445 1727204225.07872: done sending task result for task 028d2410-947f-bf02-eee4-00000000087e 41445 1727204225.07878: WORKER PROCESS EXITING fatal: [managed-node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016403", "end": "2024-09-24 14:57:05.023975", "rc": 1, "start": "2024-09-24 14:57:05.007572" } MSG: non-zero return code ...ignoring 41445 1727204225.08000: no more pending results, returning what we have 41445 1727204225.08004: results queue empty 41445 1727204225.08005: checking for any_errors_fatal 41445 1727204225.08012: done checking for any_errors_fatal 41445 1727204225.08013: checking for max_fail_percentage 41445 1727204225.08015: done checking for max_fail_percentage 41445 1727204225.08016: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.08016: done checking to see if all hosts have failed 41445 1727204225.08017: getting the remaining hosts for this loop 41445 1727204225.08019: done getting the remaining hosts for this loop 41445 1727204225.08022: getting the next task for host managed-node3 41445 1727204225.08029: done getting next task for host managed-node3 41445 1727204225.08031: ^ task 
is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41445 1727204225.08035: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204225.08038: getting variables 41445 1727204225.08039: in VariableManager get_vars() 41445 1727204225.08067: Calling all_inventory to load vars for managed-node3 41445 1727204225.08070: Calling groups_inventory to load vars for managed-node3 41445 1727204225.08073: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.08217: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.08222: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.08226: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.09419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.10380: done with get_vars() 41445 1727204225.10399: done getting variables 41445 1727204225.10444: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.372) 0:00:43.892 ***** 41445 1727204225.10468: entering _queue_task() for managed-node3/set_fact 41445 1727204225.11020: worker is 1 (out of 1 available) 41445 1727204225.11031: exiting _queue_task() for managed-node3/set_fact 41445 1727204225.11044: done queuing things up, now waiting for results queue to drain 41445 1727204225.11046: waiting for pending results... 41445 1727204225.11597: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41445 1727204225.11603: in run() - task 028d2410-947f-bf02-eee4-00000000087f 41445 1727204225.11606: variable 'ansible_search_path' from source: unknown 41445 1727204225.11616: variable 'ansible_search_path' from source: unknown 41445 1727204225.11620: calling self._execute() 41445 1727204225.11722: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.11728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.11751: variable 'omit' from source: magic vars 41445 1727204225.12206: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.12213: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.12385: variable 'nm_profile_exists' from source: set_fact 41445 1727204225.12421: Evaluated conditional (nm_profile_exists.rc == 0): False 41445 1727204225.12424: when evaluation is False, skipping this task 41445 1727204225.12427: _execute() done 41445 1727204225.12429: dumping result to 
json 41445 1727204225.12432: done dumping result, returning 41445 1727204225.12435: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-bf02-eee4-00000000087f] 41445 1727204225.12474: sending task result for task 028d2410-947f-bf02-eee4-00000000087f skipping: [managed-node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 41445 1727204225.12614: no more pending results, returning what we have 41445 1727204225.12624: results queue empty 41445 1727204225.12626: checking for any_errors_fatal 41445 1727204225.12634: done checking for any_errors_fatal 41445 1727204225.12635: checking for max_fail_percentage 41445 1727204225.12636: done checking for max_fail_percentage 41445 1727204225.12637: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.12638: done checking to see if all hosts have failed 41445 1727204225.12639: getting the remaining hosts for this loop 41445 1727204225.12640: done getting the remaining hosts for this loop 41445 1727204225.12648: getting the next task for host managed-node3 41445 1727204225.12659: done getting next task for host managed-node3 41445 1727204225.12661: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 41445 1727204225.12667: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204225.12672: getting variables 41445 1727204225.12677: in VariableManager get_vars() 41445 1727204225.12712: Calling all_inventory to load vars for managed-node3 41445 1727204225.12716: Calling groups_inventory to load vars for managed-node3 41445 1727204225.12720: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.12737: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.12741: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.12745: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.13265: done sending task result for task 028d2410-947f-bf02-eee4-00000000087f 41445 1727204225.13782: WORKER PROCESS EXITING 41445 1727204225.14471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.16105: done with get_vars() 41445 1727204225.16137: done getting variables 41445 1727204225.16201: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204225.16318: variable 'profile' from source: include params 41445 1727204225.16332: variable 'interface' from source: set_fact 41445 1727204225.16422: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 
September 2024 14:57:05 -0400 (0:00:00.059) 0:00:43.952 ***** 41445 1727204225.16454: entering _queue_task() for managed-node3/command 41445 1727204225.17186: worker is 1 (out of 1 available) 41445 1727204225.17200: exiting _queue_task() for managed-node3/command 41445 1727204225.17212: done queuing things up, now waiting for results queue to drain 41445 1727204225.17213: waiting for pending results... 41445 1727204225.17736: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 41445 1727204225.17835: in run() - task 028d2410-947f-bf02-eee4-000000000881 41445 1727204225.17848: variable 'ansible_search_path' from source: unknown 41445 1727204225.17855: variable 'ansible_search_path' from source: unknown 41445 1727204225.17913: calling self._execute() 41445 1727204225.18048: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.18053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.18055: variable 'omit' from source: magic vars 41445 1727204225.18485: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.18494: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.18682: variable 'profile_stat' from source: set_fact 41445 1727204225.18701: Evaluated conditional (profile_stat.stat.exists): False 41445 1727204225.18706: when evaluation is False, skipping this task 41445 1727204225.18708: _execute() done 41445 1727204225.18713: dumping result to json 41445 1727204225.18716: done dumping result, returning 41445 1727204225.18718: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-bf02-eee4-000000000881] 41445 1727204225.18725: sending task result for task 028d2410-947f-bf02-eee4-000000000881 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was 
False" } 41445 1727204225.18890: no more pending results, returning what we have 41445 1727204225.18894: results queue empty 41445 1727204225.18895: checking for any_errors_fatal 41445 1727204225.18901: done checking for any_errors_fatal 41445 1727204225.18902: checking for max_fail_percentage 41445 1727204225.18904: done checking for max_fail_percentage 41445 1727204225.18904: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.18906: done checking to see if all hosts have failed 41445 1727204225.18906: getting the remaining hosts for this loop 41445 1727204225.18907: done getting the remaining hosts for this loop 41445 1727204225.18920: getting the next task for host managed-node3 41445 1727204225.18928: done getting next task for host managed-node3 41445 1727204225.18930: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 41445 1727204225.18933: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204225.18936: getting variables 41445 1727204225.18938: in VariableManager get_vars() 41445 1727204225.18967: Calling all_inventory to load vars for managed-node3 41445 1727204225.18970: Calling groups_inventory to load vars for managed-node3 41445 1727204225.18973: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.18980: done sending task result for task 028d2410-947f-bf02-eee4-000000000881 41445 1727204225.18983: WORKER PROCESS EXITING 41445 1727204225.18993: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.18996: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.18998: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.20288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.21294: done with get_vars() 41445 1727204225.21313: done getting variables 41445 1727204225.21355: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204225.21438: variable 'profile' from source: include params 41445 1727204225.21441: variable 'interface' from source: set_fact 41445 1727204225.21482: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.050) 0:00:44.002 ***** 41445 1727204225.21509: entering _queue_task() for managed-node3/set_fact 41445 1727204225.21876: worker is 1 (out of 1 available) 41445 1727204225.21889: exiting _queue_task() for managed-node3/set_fact 41445 
1727204225.21902: done queuing things up, now waiting for results queue to drain 41445 1727204225.21904: waiting for pending results... 41445 1727204225.22162: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 41445 1727204225.22276: in run() - task 028d2410-947f-bf02-eee4-000000000882 41445 1727204225.22289: variable 'ansible_search_path' from source: unknown 41445 1727204225.22293: variable 'ansible_search_path' from source: unknown 41445 1727204225.22335: calling self._execute() 41445 1727204225.22433: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.22436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.22480: variable 'omit' from source: magic vars 41445 1727204225.22974: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.22979: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.22982: variable 'profile_stat' from source: set_fact 41445 1727204225.23281: Evaluated conditional (profile_stat.stat.exists): False 41445 1727204225.23284: when evaluation is False, skipping this task 41445 1727204225.23285: _execute() done 41445 1727204225.23287: dumping result to json 41445 1727204225.23289: done dumping result, returning 41445 1727204225.23291: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-bf02-eee4-000000000882] 41445 1727204225.23292: sending task result for task 028d2410-947f-bf02-eee4-000000000882 41445 1727204225.23353: done sending task result for task 028d2410-947f-bf02-eee4-000000000882 41445 1727204225.23356: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41445 1727204225.23402: no more pending results, returning what we have 41445 1727204225.23406: results queue empty 
41445 1727204225.23407: checking for any_errors_fatal 41445 1727204225.23415: done checking for any_errors_fatal 41445 1727204225.23416: checking for max_fail_percentage 41445 1727204225.23418: done checking for max_fail_percentage 41445 1727204225.23418: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.23419: done checking to see if all hosts have failed 41445 1727204225.23420: getting the remaining hosts for this loop 41445 1727204225.23421: done getting the remaining hosts for this loop 41445 1727204225.23425: getting the next task for host managed-node3 41445 1727204225.23432: done getting next task for host managed-node3 41445 1727204225.23435: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 41445 1727204225.23438: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204225.23441: getting variables 41445 1727204225.23443: in VariableManager get_vars() 41445 1727204225.23471: Calling all_inventory to load vars for managed-node3 41445 1727204225.23474: Calling groups_inventory to load vars for managed-node3 41445 1727204225.23480: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.23493: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.23497: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.23500: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.24907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.26518: done with get_vars() 41445 1727204225.26543: done getting variables 41445 1727204225.26604: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204225.26720: variable 'profile' from source: include params 41445 1727204225.26724: variable 'interface' from source: set_fact 41445 1727204225.26782: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.053) 0:00:44.055 ***** 41445 1727204225.26817: entering _queue_task() for managed-node3/command 41445 1727204225.27136: worker is 1 (out of 1 available) 41445 1727204225.27149: exiting _queue_task() for managed-node3/command 41445 1727204225.27160: done queuing things up, now waiting for results queue to drain 41445 1727204225.27161: waiting for pending results... 
41445 1727204225.27437: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 41445 1727204225.27600: in run() - task 028d2410-947f-bf02-eee4-000000000883 41445 1727204225.27604: variable 'ansible_search_path' from source: unknown 41445 1727204225.27607: variable 'ansible_search_path' from source: unknown 41445 1727204225.27640: calling self._execute() 41445 1727204225.27747: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.27981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.27984: variable 'omit' from source: magic vars 41445 1727204225.28389: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.28408: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.28545: variable 'profile_stat' from source: set_fact 41445 1727204225.28563: Evaluated conditional (profile_stat.stat.exists): False 41445 1727204225.28568: when evaluation is False, skipping this task 41445 1727204225.28574: _execute() done 41445 1727204225.28584: dumping result to json 41445 1727204225.28591: done dumping result, returning 41445 1727204225.28599: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-bf02-eee4-000000000883] 41445 1727204225.28609: sending task result for task 028d2410-947f-bf02-eee4-000000000883 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41445 1727204225.29129: no more pending results, returning what we have 41445 1727204225.29133: results queue empty 41445 1727204225.29135: checking for any_errors_fatal 41445 1727204225.29139: done checking for any_errors_fatal 41445 1727204225.29140: checking for max_fail_percentage 41445 1727204225.29142: done checking for max_fail_percentage 41445 1727204225.29143: checking to see if all hosts have 
failed and the running result is not ok 41445 1727204225.29144: done checking to see if all hosts have failed 41445 1727204225.29145: getting the remaining hosts for this loop 41445 1727204225.29146: done getting the remaining hosts for this loop 41445 1727204225.29149: getting the next task for host managed-node3 41445 1727204225.29156: done getting next task for host managed-node3 41445 1727204225.29159: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 41445 1727204225.29162: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204225.29166: getting variables 41445 1727204225.29168: in VariableManager get_vars() 41445 1727204225.29198: Calling all_inventory to load vars for managed-node3 41445 1727204225.29202: Calling groups_inventory to load vars for managed-node3 41445 1727204225.29205: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.29219: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.29222: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.29225: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.29990: done sending task result for task 028d2410-947f-bf02-eee4-000000000883 41445 1727204225.29994: WORKER PROCESS EXITING 41445 1727204225.30905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.33357: done with get_vars() 41445 1727204225.33390: done getting variables 41445 1727204225.33457: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204225.33574: variable 'profile' from source: include params 41445 1727204225.33584: variable 'interface' from source: set_fact 41445 1727204225.33648: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.068) 0:00:44.124 ***** 41445 1727204225.33688: entering _queue_task() for managed-node3/set_fact 41445 1727204225.34207: worker is 1 (out of 1 available) 41445 1727204225.34222: exiting _queue_task() for managed-node3/set_fact 41445 
1727204225.34233: done queuing things up, now waiting for results queue to drain 41445 1727204225.34234: waiting for pending results... 41445 1727204225.34385: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 41445 1727204225.34534: in run() - task 028d2410-947f-bf02-eee4-000000000884 41445 1727204225.34555: variable 'ansible_search_path' from source: unknown 41445 1727204225.34570: variable 'ansible_search_path' from source: unknown 41445 1727204225.34617: calling self._execute() 41445 1727204225.34730: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.34742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.34756: variable 'omit' from source: magic vars 41445 1727204225.35152: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.35172: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.35301: variable 'profile_stat' from source: set_fact 41445 1727204225.35329: Evaluated conditional (profile_stat.stat.exists): False 41445 1727204225.35337: when evaluation is False, skipping this task 41445 1727204225.35344: _execute() done 41445 1727204225.35352: dumping result to json 41445 1727204225.35360: done dumping result, returning 41445 1727204225.35370: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-bf02-eee4-000000000884] 41445 1727204225.35384: sending task result for task 028d2410-947f-bf02-eee4-000000000884 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41445 1727204225.35539: no more pending results, returning what we have 41445 1727204225.35545: results queue empty 41445 1727204225.35546: checking for any_errors_fatal 41445 1727204225.35552: done checking for any_errors_fatal 41445 1727204225.35553: checking for 
max_fail_percentage 41445 1727204225.35555: done checking for max_fail_percentage 41445 1727204225.35555: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.35557: done checking to see if all hosts have failed 41445 1727204225.35557: getting the remaining hosts for this loop 41445 1727204225.35559: done getting the remaining hosts for this loop 41445 1727204225.35563: getting the next task for host managed-node3 41445 1727204225.35573: done getting next task for host managed-node3 41445 1727204225.35577: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 41445 1727204225.35581: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204225.35586: getting variables 41445 1727204225.35587: in VariableManager get_vars() 41445 1727204225.35807: Calling all_inventory to load vars for managed-node3 41445 1727204225.35814: Calling groups_inventory to load vars for managed-node3 41445 1727204225.35818: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.35833: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.35836: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.35840: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.36687: done sending task result for task 028d2410-947f-bf02-eee4-000000000884 41445 1727204225.36691: WORKER PROCESS EXITING 41445 1727204225.37549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.38561: done with get_vars() 41445 1727204225.38581: done getting variables 41445 1727204225.38628: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204225.38717: variable 'profile' from source: include params 41445 1727204225.38721: variable 'interface' from source: set_fact 41445 1727204225.38760: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.050) 0:00:44.175 ***** 41445 1727204225.38785: entering _queue_task() for managed-node3/assert 41445 1727204225.39159: worker is 1 (out of 1 available) 41445 1727204225.39173: exiting _queue_task() for managed-node3/assert 41445 
1727204225.39187: done queuing things up, now waiting for results queue to drain 41445 1727204225.39189: waiting for pending results... 41445 1727204225.39695: running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' 41445 1727204225.39750: in run() - task 028d2410-947f-bf02-eee4-00000000086d 41445 1727204225.39754: variable 'ansible_search_path' from source: unknown 41445 1727204225.39757: variable 'ansible_search_path' from source: unknown 41445 1727204225.39760: calling self._execute() 41445 1727204225.39762: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.39765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.39971: variable 'omit' from source: magic vars 41445 1727204225.40893: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.40914: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.40923: variable 'omit' from source: magic vars 41445 1727204225.40960: variable 'omit' from source: magic vars 41445 1727204225.41134: variable 'profile' from source: include params 41445 1727204225.41138: variable 'interface' from source: set_fact 41445 1727204225.41234: variable 'interface' from source: set_fact 41445 1727204225.41255: variable 'omit' from source: magic vars 41445 1727204225.41294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204225.41324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204225.41345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204225.41369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204225.41400: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204225.41425: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204225.41428: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.41431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.41508: Set connection var ansible_shell_executable to /bin/sh 41445 1727204225.41511: Set connection var ansible_shell_type to sh 41445 1727204225.41518: Set connection var ansible_pipelining to False 41445 1727204225.41525: Set connection var ansible_timeout to 10 41445 1727204225.41527: Set connection var ansible_connection to ssh 41445 1727204225.41534: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204225.41552: variable 'ansible_shell_executable' from source: unknown 41445 1727204225.41555: variable 'ansible_connection' from source: unknown 41445 1727204225.41558: variable 'ansible_module_compression' from source: unknown 41445 1727204225.41561: variable 'ansible_shell_type' from source: unknown 41445 1727204225.41563: variable 'ansible_shell_executable' from source: unknown 41445 1727204225.41566: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.41568: variable 'ansible_pipelining' from source: unknown 41445 1727204225.41571: variable 'ansible_timeout' from source: unknown 41445 1727204225.41573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.41688: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204225.41701: variable 'omit' from source: magic vars 41445 1727204225.41704: starting 
attempt loop 41445 1727204225.41706: running the handler 41445 1727204225.41795: variable 'lsr_net_profile_exists' from source: set_fact 41445 1727204225.41798: Evaluated conditional (not lsr_net_profile_exists): True 41445 1727204225.41808: handler run complete 41445 1727204225.41821: attempt loop complete, returning result 41445 1727204225.41823: _execute() done 41445 1727204225.41826: dumping result to json 41445 1727204225.41829: done dumping result, returning 41445 1727204225.41839: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'ethtest0' [028d2410-947f-bf02-eee4-00000000086d] 41445 1727204225.41841: sending task result for task 028d2410-947f-bf02-eee4-00000000086d 41445 1727204225.41922: done sending task result for task 028d2410-947f-bf02-eee4-00000000086d 41445 1727204225.41925: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41445 1727204225.41983: no more pending results, returning what we have 41445 1727204225.41987: results queue empty 41445 1727204225.41987: checking for any_errors_fatal 41445 1727204225.41993: done checking for any_errors_fatal 41445 1727204225.41994: checking for max_fail_percentage 41445 1727204225.41996: done checking for max_fail_percentage 41445 1727204225.41996: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.41997: done checking to see if all hosts have failed 41445 1727204225.41998: getting the remaining hosts for this loop 41445 1727204225.41999: done getting the remaining hosts for this loop 41445 1727204225.42002: getting the next task for host managed-node3 41445 1727204225.42011: done getting next task for host managed-node3 41445 1727204225.42014: ^ task is: TASK: Include the task 'assert_device_absent.yml' 41445 1727204225.42016: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204225.42019: getting variables 41445 1727204225.42021: in VariableManager get_vars() 41445 1727204225.42058: Calling all_inventory to load vars for managed-node3 41445 1727204225.42061: Calling groups_inventory to load vars for managed-node3 41445 1727204225.42065: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.42079: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.42082: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.42085: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.42915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.43810: done with get_vars() 41445 1727204225.43828: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:156 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.051) 0:00:44.226 ***** 41445 1727204225.43897: entering _queue_task() for managed-node3/include_tasks 41445 1727204225.44168: worker is 1 (out of 1 available) 41445 1727204225.44187: exiting _queue_task() for managed-node3/include_tasks 41445 1727204225.44199: done queuing things up, now waiting for results queue to drain 41445 1727204225.44200: waiting for pending results... 
41445 1727204225.44739: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' 41445 1727204225.44744: in run() - task 028d2410-947f-bf02-eee4-0000000000f0 41445 1727204225.44839: variable 'ansible_search_path' from source: unknown 41445 1727204225.44845: calling self._execute() 41445 1727204225.44848: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.44851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.44853: variable 'omit' from source: magic vars 41445 1727204225.45182: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.45187: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.45215: _execute() done 41445 1727204225.45218: dumping result to json 41445 1727204225.45221: done dumping result, returning 41445 1727204225.45223: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' [028d2410-947f-bf02-eee4-0000000000f0] 41445 1727204225.45225: sending task result for task 028d2410-947f-bf02-eee4-0000000000f0 41445 1727204225.45347: done sending task result for task 028d2410-947f-bf02-eee4-0000000000f0 41445 1727204225.45351: WORKER PROCESS EXITING 41445 1727204225.45505: no more pending results, returning what we have 41445 1727204225.45513: in VariableManager get_vars() 41445 1727204225.45551: Calling all_inventory to load vars for managed-node3 41445 1727204225.45554: Calling groups_inventory to load vars for managed-node3 41445 1727204225.45558: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.45573: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.45579: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.45583: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.48454: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.51106: done with get_vars() 41445 1727204225.51137: variable 'ansible_search_path' from source: unknown 41445 1727204225.51154: we have included files to process 41445 1727204225.51156: generating all_blocks data 41445 1727204225.51158: done generating all_blocks data 41445 1727204225.51164: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41445 1727204225.51165: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41445 1727204225.51168: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41445 1727204225.51546: in VariableManager get_vars() 41445 1727204225.51565: done with get_vars() 41445 1727204225.51882: done processing included file 41445 1727204225.51884: iterating over new_blocks loaded from include file 41445 1727204225.51886: in VariableManager get_vars() 41445 1727204225.51898: done with get_vars() 41445 1727204225.51900: filtering new block on tags 41445 1727204225.51919: done filtering new block on tags 41445 1727204225.51921: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 41445 1727204225.51927: extending task lists for all hosts with included blocks 41445 1727204225.52419: done extending task lists 41445 1727204225.52420: done processing included files 41445 1727204225.52421: results queue empty 41445 1727204225.52422: checking for any_errors_fatal 41445 1727204225.52425: done checking for any_errors_fatal 41445 1727204225.52426: checking for max_fail_percentage 41445 1727204225.52427: done 
checking for max_fail_percentage 41445 1727204225.52427: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.52428: done checking to see if all hosts have failed 41445 1727204225.52428: getting the remaining hosts for this loop 41445 1727204225.52430: done getting the remaining hosts for this loop 41445 1727204225.52432: getting the next task for host managed-node3 41445 1727204225.52436: done getting next task for host managed-node3 41445 1727204225.52438: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41445 1727204225.52440: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204225.52442: getting variables 41445 1727204225.52443: in VariableManager get_vars() 41445 1727204225.52452: Calling all_inventory to load vars for managed-node3 41445 1727204225.52455: Calling groups_inventory to load vars for managed-node3 41445 1727204225.52457: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.52462: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.52465: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.52468: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.54927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.56818: done with get_vars() 41445 1727204225.56848: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.130) 0:00:44.357 ***** 41445 1727204225.56936: entering _queue_task() for managed-node3/include_tasks 41445 1727204225.57621: worker is 1 (out of 1 available) 41445 1727204225.57633: exiting _queue_task() for managed-node3/include_tasks 41445 1727204225.57646: done queuing things up, now waiting for results queue to drain 41445 1727204225.57647: waiting for pending results... 
41445 1727204225.58183: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 41445 1727204225.58458: in run() - task 028d2410-947f-bf02-eee4-0000000008b5 41445 1727204225.58592: variable 'ansible_search_path' from source: unknown 41445 1727204225.58595: variable 'ansible_search_path' from source: unknown 41445 1727204225.58660: calling self._execute() 41445 1727204225.59008: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.59020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.59095: variable 'omit' from source: magic vars 41445 1727204225.59509: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.59526: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.59530: _execute() done 41445 1727204225.59535: dumping result to json 41445 1727204225.59542: done dumping result, returning 41445 1727204225.59546: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-bf02-eee4-0000000008b5] 41445 1727204225.59636: sending task result for task 028d2410-947f-bf02-eee4-0000000008b5 41445 1727204225.59708: done sending task result for task 028d2410-947f-bf02-eee4-0000000008b5 41445 1727204225.59712: WORKER PROCESS EXITING 41445 1727204225.59741: no more pending results, returning what we have 41445 1727204225.59748: in VariableManager get_vars() 41445 1727204225.59789: Calling all_inventory to load vars for managed-node3 41445 1727204225.59793: Calling groups_inventory to load vars for managed-node3 41445 1727204225.59797: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.59816: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.59820: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.59824: Calling groups_plugins_play to load vars for managed-node3 41445 
1727204225.70364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.73203: done with get_vars() 41445 1727204225.73227: variable 'ansible_search_path' from source: unknown 41445 1727204225.73229: variable 'ansible_search_path' from source: unknown 41445 1727204225.73257: we have included files to process 41445 1727204225.73258: generating all_blocks data 41445 1727204225.73259: done generating all_blocks data 41445 1727204225.73259: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41445 1727204225.73260: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41445 1727204225.73261: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41445 1727204225.73405: done processing included file 41445 1727204225.73407: iterating over new_blocks loaded from include file 41445 1727204225.73408: in VariableManager get_vars() 41445 1727204225.73419: done with get_vars() 41445 1727204225.73421: filtering new block on tags 41445 1727204225.73430: done filtering new block on tags 41445 1727204225.73431: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 41445 1727204225.73434: extending task lists for all hosts with included blocks 41445 1727204225.73539: done extending task lists 41445 1727204225.73540: done processing included files 41445 1727204225.73541: results queue empty 41445 1727204225.73542: checking for any_errors_fatal 41445 1727204225.73544: done checking for any_errors_fatal 41445 1727204225.73544: checking for max_fail_percentage 41445 1727204225.73545: done checking for 
max_fail_percentage 41445 1727204225.73545: checking to see if all hosts have failed and the running result is not ok 41445 1727204225.73546: done checking to see if all hosts have failed 41445 1727204225.73546: getting the remaining hosts for this loop 41445 1727204225.73547: done getting the remaining hosts for this loop 41445 1727204225.73549: getting the next task for host managed-node3 41445 1727204225.73552: done getting next task for host managed-node3 41445 1727204225.73553: ^ task is: TASK: Get stat for interface {{ interface }} 41445 1727204225.73555: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204225.73556: getting variables 41445 1727204225.73557: in VariableManager get_vars() 41445 1727204225.73563: Calling all_inventory to load vars for managed-node3 41445 1727204225.73565: Calling groups_inventory to load vars for managed-node3 41445 1727204225.73566: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204225.73570: Calling all_plugins_play to load vars for managed-node3 41445 1727204225.73572: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204225.73574: Calling groups_plugins_play to load vars for managed-node3 41445 1727204225.74308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204225.75221: done with get_vars() 41445 1727204225.75240: done getting variables 41445 1727204225.75346: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:57:05 -0400 (0:00:00.184) 0:00:44.541 ***** 41445 1727204225.75367: entering _queue_task() for managed-node3/stat 41445 1727204225.75671: worker is 1 (out of 1 available) 41445 1727204225.75685: exiting _queue_task() for managed-node3/stat 41445 1727204225.75697: done queuing things up, now waiting for results queue to drain 41445 1727204225.75698: waiting for pending results... 
41445 1727204225.75943: running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 41445 1727204225.76091: in run() - task 028d2410-947f-bf02-eee4-0000000008cf 41445 1727204225.76096: variable 'ansible_search_path' from source: unknown 41445 1727204225.76100: variable 'ansible_search_path' from source: unknown 41445 1727204225.76148: calling self._execute() 41445 1727204225.76382: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.76386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.76390: variable 'omit' from source: magic vars 41445 1727204225.76665: variable 'ansible_distribution_major_version' from source: facts 41445 1727204225.76678: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204225.76685: variable 'omit' from source: magic vars 41445 1727204225.76738: variable 'omit' from source: magic vars 41445 1727204225.76847: variable 'interface' from source: set_fact 41445 1727204225.76861: variable 'omit' from source: magic vars 41445 1727204225.76906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204225.76935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204225.76957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204225.76971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204225.76982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204225.77021: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204225.77025: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.77028: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.77114: Set connection var ansible_shell_executable to /bin/sh 41445 1727204225.77117: Set connection var ansible_shell_type to sh 41445 1727204225.77120: Set connection var ansible_pipelining to False 41445 1727204225.77127: Set connection var ansible_timeout to 10 41445 1727204225.77130: Set connection var ansible_connection to ssh 41445 1727204225.77136: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204225.77159: variable 'ansible_shell_executable' from source: unknown 41445 1727204225.77165: variable 'ansible_connection' from source: unknown 41445 1727204225.77168: variable 'ansible_module_compression' from source: unknown 41445 1727204225.77190: variable 'ansible_shell_type' from source: unknown 41445 1727204225.77193: variable 'ansible_shell_executable' from source: unknown 41445 1727204225.77196: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204225.77198: variable 'ansible_pipelining' from source: unknown 41445 1727204225.77200: variable 'ansible_timeout' from source: unknown 41445 1727204225.77203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204225.77447: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 41445 1727204225.77452: variable 'omit' from source: magic vars 41445 1727204225.77455: starting attempt loop 41445 1727204225.77457: running the handler 41445 1727204225.77460: _low_level_execute_command(): starting 41445 1727204225.77462: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204225.78338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204225.78342: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 41445 1727204225.78347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204225.78351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204225.78353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204225.78356: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204225.78371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.78386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204225.78396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204225.78408: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204225.78431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204225.78435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204225.78437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.78505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204225.78509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204225.78524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204225.78583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204225.80241: stdout chunk (state=3): >>>/root <<< 41445 1727204225.80360: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204225.80405: stderr chunk (state=3): >>><<< 41445 1727204225.80408: stdout chunk (state=3): >>><<< 41445 1727204225.80436: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204225.80450: _low_level_execute_command(): starting 41445 1727204225.80455: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623 `" && echo ansible-tmp-1727204225.8042963-44366-198418364436623="` echo /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623 `" ) && sleep 0' 41445 1727204225.81065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 41445 1727204225.81068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204225.81073: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.81077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204225.81083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.81126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204225.81129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204225.81199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204225.83056: stdout chunk (state=3): >>>ansible-tmp-1727204225.8042963-44366-198418364436623=/root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623 <<< 41445 1727204225.83184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204225.83188: stderr chunk (state=3): >>><<< 41445 1727204225.83191: stdout chunk (state=3): >>><<< 41445 1727204225.83217: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204225.8042963-44366-198418364436623=/root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204225.83262: variable 'ansible_module_compression' from source: unknown 41445 1727204225.83354: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41445 1727204225.83369: variable 'ansible_facts' from source: unknown 41445 1727204225.83434: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py 41445 1727204225.83581: Sending initial data 41445 1727204225.83586: Sent initial data (153 bytes) 41445 1727204225.84099: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 41445 1727204225.84111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.84123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.84205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204225.84219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204225.84261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204225.85811: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41445 
1727204225.85826: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204225.85847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41445 1727204225.85879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmplvh6q8kb /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py <<< 41445 1727204225.85885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py" <<< 41445 1727204225.85910: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmplvh6q8kb" to remote "/root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py" <<< 41445 1727204225.85918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py" <<< 41445 1727204225.86412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204225.86477: stderr chunk (state=3): >>><<< 41445 1727204225.86485: stdout chunk (state=3): >>><<< 41445 1727204225.86495: done transferring module to remote 41445 1727204225.86513: _low_level_execute_command(): starting 41445 1727204225.86516: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/ /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py && sleep 0' 41445 1727204225.87071: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204225.87074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found <<< 41445 1727204225.87089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204225.87092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found <<< 41445 1727204225.87095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.87099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204225.87131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204225.87193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204225.89088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204225.89092: stdout chunk (state=3): >>><<< 41445 1727204225.89094: stderr chunk (state=3): >>><<< 41445 1727204225.89097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204225.89099: _low_level_execute_command(): starting 41445 1727204225.89106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/AnsiballZ_stat.py && sleep 0' 41445 1727204225.89669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204225.89679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204225.89707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204225.89710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.89713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204225.89715: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204225.89823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204225.89840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204225.89856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.04821: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41445 1727204226.06082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204226.06085: stderr chunk (state=3): >>><<< 41445 1727204226.06088: stdout chunk (state=3): >>><<< 41445 1727204226.06119: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
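[editor's note] The module stdout captured above is a single JSON document that the controller parses to recover the task result. A small sketch of that parse step, using only the fields visible in this log:

```python
import json

# The JSON payload emitted by AnsiballZ_stat.py, as captured in the log above.
module_stdout = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"get_attributes": false, '
    '"get_checksum": false, "get_mime": false, '
    '"path": "/sys/class/net/ethtest0", "follow": false, '
    '"checksum_algorithm": "sha1"}}}'
)

result = json.loads(module_stdout)
# The later assert task keys off this single boolean.
interface_absent = not result["stat"]["exists"]
```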
41445 1727204226.06141: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204226.06149: _low_level_execute_command(): starting 41445 1727204226.06154: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204225.8042963-44366-198418364436623/ > /dev/null 2>&1 && sleep 0' 41445 1727204226.06945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.07012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.07070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.07215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.08938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.08978: stdout chunk (state=3): >>><<< 41445 1727204226.08982: stderr chunk (state=3): >>><<< 41445 1727204226.09071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.09074: handler run complete 41445 1727204226.09079: attempt loop complete, returning result 
41445 1727204226.09081: _execute() done 41445 1727204226.09083: dumping result to json 41445 1727204226.09085: done dumping result, returning 41445 1727204226.09088: done running TaskExecutor() for managed-node3/TASK: Get stat for interface ethtest0 [028d2410-947f-bf02-eee4-0000000008cf] 41445 1727204226.09091: sending task result for task 028d2410-947f-bf02-eee4-0000000008cf 41445 1727204226.09155: done sending task result for task 028d2410-947f-bf02-eee4-0000000008cf 41445 1727204226.09158: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 41445 1727204226.09231: no more pending results, returning what we have 41445 1727204226.09235: results queue empty 41445 1727204226.09236: checking for any_errors_fatal 41445 1727204226.09237: done checking for any_errors_fatal 41445 1727204226.09238: checking for max_fail_percentage 41445 1727204226.09240: done checking for max_fail_percentage 41445 1727204226.09240: checking to see if all hosts have failed and the running result is not ok 41445 1727204226.09241: done checking to see if all hosts have failed 41445 1727204226.09242: getting the remaining hosts for this loop 41445 1727204226.09243: done getting the remaining hosts for this loop 41445 1727204226.09248: getting the next task for host managed-node3 41445 1727204226.09259: done getting next task for host managed-node3 41445 1727204226.09262: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 41445 1727204226.09264: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204226.09383: getting variables 41445 1727204226.09386: in VariableManager get_vars() 41445 1727204226.09420: Calling all_inventory to load vars for managed-node3 41445 1727204226.09424: Calling groups_inventory to load vars for managed-node3 41445 1727204226.09427: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204226.09440: Calling all_plugins_play to load vars for managed-node3 41445 1727204226.09443: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204226.09446: Calling groups_plugins_play to load vars for managed-node3 41445 1727204226.11969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204226.12920: done with get_vars() 41445 1727204226.12937: done getting variables 41445 1727204226.12986: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 41445 1727204226.13078: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.377) 0:00:44.918 ***** 41445 1727204226.13102: entering _queue_task() for managed-node3/assert 41445 1727204226.13356: worker is 1 (out of 1 available) 41445 1727204226.13370: exiting _queue_task() for managed-node3/assert 41445 1727204226.13384: done queuing things up, now waiting for results queue to drain 41445 1727204226.13386: waiting for pending results... 
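[editor's note] The assert action queued here reduces to the single conditional the log evaluates below, `not interface_stat.stat.exists`. A sketch of the equivalent check, with a hypothetical helper standing in for the assert action's pass/fail behavior (the function name and return shape are illustrative):

```python
def assert_device_absent(interface_stat):
    # Mimics the conditional 'not interface_stat.stat.exists': fail if the
    # sysfs entry exists, otherwise return the result shape the log prints
    # (changed: false, "All assertions passed").
    if interface_stat["stat"]["exists"]:
        raise AssertionError("interface unexpectedly present")
    return {"changed": False, "msg": "All assertions passed"}
```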
41445 1727204226.13563: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' 41445 1727204226.13637: in run() - task 028d2410-947f-bf02-eee4-0000000008b6 41445 1727204226.13648: variable 'ansible_search_path' from source: unknown 41445 1727204226.13651: variable 'ansible_search_path' from source: unknown 41445 1727204226.13680: calling self._execute() 41445 1727204226.13762: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.13766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.13774: variable 'omit' from source: magic vars 41445 1727204226.14060: variable 'ansible_distribution_major_version' from source: facts 41445 1727204226.14071: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204226.14079: variable 'omit' from source: magic vars 41445 1727204226.14108: variable 'omit' from source: magic vars 41445 1727204226.14187: variable 'interface' from source: set_fact 41445 1727204226.14201: variable 'omit' from source: magic vars 41445 1727204226.14236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204226.14263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204226.14281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204226.14295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204226.14305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204226.14330: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204226.14333: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.14336: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.14405: Set connection var ansible_shell_executable to /bin/sh 41445 1727204226.14408: Set connection var ansible_shell_type to sh 41445 1727204226.14412: Set connection var ansible_pipelining to False 41445 1727204226.14421: Set connection var ansible_timeout to 10 41445 1727204226.14424: Set connection var ansible_connection to ssh 41445 1727204226.14430: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204226.14448: variable 'ansible_shell_executable' from source: unknown 41445 1727204226.14451: variable 'ansible_connection' from source: unknown 41445 1727204226.14454: variable 'ansible_module_compression' from source: unknown 41445 1727204226.14456: variable 'ansible_shell_type' from source: unknown 41445 1727204226.14459: variable 'ansible_shell_executable' from source: unknown 41445 1727204226.14461: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.14463: variable 'ansible_pipelining' from source: unknown 41445 1727204226.14465: variable 'ansible_timeout' from source: unknown 41445 1727204226.14470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.14572: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204226.14584: variable 'omit' from source: magic vars 41445 1727204226.14587: starting attempt loop 41445 1727204226.14590: running the handler 41445 1727204226.14688: variable 'interface_stat' from source: set_fact 41445 1727204226.14696: Evaluated conditional (not interface_stat.stat.exists): True 41445 1727204226.14704: handler run complete 41445 1727204226.14717: attempt loop complete, returning result 
41445 1727204226.14720: _execute() done 41445 1727204226.14722: dumping result to json 41445 1727204226.14725: done dumping result, returning 41445 1727204226.14731: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'ethtest0' [028d2410-947f-bf02-eee4-0000000008b6] 41445 1727204226.14737: sending task result for task 028d2410-947f-bf02-eee4-0000000008b6 41445 1727204226.14822: done sending task result for task 028d2410-947f-bf02-eee4-0000000008b6 41445 1727204226.14825: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 41445 1727204226.14939: no more pending results, returning what we have 41445 1727204226.14942: results queue empty 41445 1727204226.14943: checking for any_errors_fatal 41445 1727204226.14950: done checking for any_errors_fatal 41445 1727204226.14950: checking for max_fail_percentage 41445 1727204226.14952: done checking for max_fail_percentage 41445 1727204226.14953: checking to see if all hosts have failed and the running result is not ok 41445 1727204226.14953: done checking to see if all hosts have failed 41445 1727204226.14955: getting the remaining hosts for this loop 41445 1727204226.14956: done getting the remaining hosts for this loop 41445 1727204226.14959: getting the next task for host managed-node3 41445 1727204226.14967: done getting next task for host managed-node3 41445 1727204226.14970: ^ task is: TASK: Verify network state restored to default 41445 1727204226.14971: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204226.14974: getting variables 41445 1727204226.14977: in VariableManager get_vars() 41445 1727204226.15001: Calling all_inventory to load vars for managed-node3 41445 1727204226.15003: Calling groups_inventory to load vars for managed-node3 41445 1727204226.15006: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204226.15015: Calling all_plugins_play to load vars for managed-node3 41445 1727204226.15018: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204226.15020: Calling groups_plugins_play to load vars for managed-node3 41445 1727204226.16331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204226.18995: done with get_vars() 41445 1727204226.19025: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:158 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.060) 0:00:44.979 ***** 41445 1727204226.19149: entering _queue_task() for managed-node3/include_tasks 41445 1727204226.19996: worker is 1 (out of 1 available) 41445 1727204226.20007: exiting _queue_task() for managed-node3/include_tasks 41445 1727204226.20019: done queuing things up, now waiting for results queue to drain 41445 1727204226.20020: waiting for pending results... 
41445 1727204226.20193: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 41445 1727204226.20199: in run() - task 028d2410-947f-bf02-eee4-0000000000f1 41445 1727204226.20203: variable 'ansible_search_path' from source: unknown 41445 1727204226.20239: calling self._execute() 41445 1727204226.20347: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.20360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.20456: variable 'omit' from source: magic vars 41445 1727204226.20747: variable 'ansible_distribution_major_version' from source: facts 41445 1727204226.20758: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204226.20768: _execute() done 41445 1727204226.20771: dumping result to json 41445 1727204226.20773: done dumping result, returning 41445 1727204226.20778: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [028d2410-947f-bf02-eee4-0000000000f1] 41445 1727204226.20783: sending task result for task 028d2410-947f-bf02-eee4-0000000000f1 41445 1727204226.20866: done sending task result for task 028d2410-947f-bf02-eee4-0000000000f1 41445 1727204226.20877: WORKER PROCESS EXITING 41445 1727204226.20917: no more pending results, returning what we have 41445 1727204226.20922: in VariableManager get_vars() 41445 1727204226.20955: Calling all_inventory to load vars for managed-node3 41445 1727204226.20958: Calling groups_inventory to load vars for managed-node3 41445 1727204226.20961: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204226.20974: Calling all_plugins_play to load vars for managed-node3 41445 1727204226.20979: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204226.20982: Calling groups_plugins_play to load vars for managed-node3 41445 1727204226.22407: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204226.24472: done with get_vars() 41445 1727204226.24496: variable 'ansible_search_path' from source: unknown 41445 1727204226.24513: we have included files to process 41445 1727204226.24515: generating all_blocks data 41445 1727204226.24517: done generating all_blocks data 41445 1727204226.24521: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41445 1727204226.24522: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41445 1727204226.24525: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41445 1727204226.25000: done processing included file 41445 1727204226.25002: iterating over new_blocks loaded from include file 41445 1727204226.25003: in VariableManager get_vars() 41445 1727204226.25018: done with get_vars() 41445 1727204226.25020: filtering new block on tags 41445 1727204226.25035: done filtering new block on tags 41445 1727204226.25038: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 41445 1727204226.25043: extending task lists for all hosts with included blocks 41445 1727204226.25382: done extending task lists 41445 1727204226.25384: done processing included files 41445 1727204226.25385: results queue empty 41445 1727204226.25385: checking for any_errors_fatal 41445 1727204226.25389: done checking for any_errors_fatal 41445 1727204226.25390: checking for max_fail_percentage 41445 1727204226.25391: done checking for max_fail_percentage 41445 1727204226.25392: checking to see if all hosts have failed and the running 
result is not ok 41445 1727204226.25392: done checking to see if all hosts have failed 41445 1727204226.25393: getting the remaining hosts for this loop 41445 1727204226.25394: done getting the remaining hosts for this loop 41445 1727204226.25397: getting the next task for host managed-node3 41445 1727204226.25401: done getting next task for host managed-node3 41445 1727204226.25403: ^ task is: TASK: Check routes and DNS 41445 1727204226.25405: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204226.25407: getting variables 41445 1727204226.25408: in VariableManager get_vars() 41445 1727204226.25420: Calling all_inventory to load vars for managed-node3 41445 1727204226.25426: Calling groups_inventory to load vars for managed-node3 41445 1727204226.25428: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204226.25434: Calling all_plugins_play to load vars for managed-node3 41445 1727204226.25436: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204226.25439: Calling groups_plugins_play to load vars for managed-node3 41445 1727204226.26846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204226.28942: done with get_vars() 41445 1727204226.28972: done getting variables 41445 1727204226.29145: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.100) 0:00:45.079 ***** 41445 1727204226.29180: entering _queue_task() for managed-node3/shell 41445 1727204226.29800: worker is 1 (out of 1 available) 41445 1727204226.29811: exiting _queue_task() for managed-node3/shell 41445 1727204226.29821: done queuing things up, now waiting for results queue to drain 41445 1727204226.29822: waiting for pending results... 
41445 1727204226.29987: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 41445 1727204226.30018: in run() - task 028d2410-947f-bf02-eee4-0000000008e7 41445 1727204226.30033: variable 'ansible_search_path' from source: unknown 41445 1727204226.30036: variable 'ansible_search_path' from source: unknown 41445 1727204226.30083: calling self._execute() 41445 1727204226.30191: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.30194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.30203: variable 'omit' from source: magic vars 41445 1727204226.30604: variable 'ansible_distribution_major_version' from source: facts 41445 1727204226.30618: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204226.30625: variable 'omit' from source: magic vars 41445 1727204226.30660: variable 'omit' from source: magic vars 41445 1727204226.30700: variable 'omit' from source: magic vars 41445 1727204226.30747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204226.30785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204226.30806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204226.30832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204226.30842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204226.30873: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204226.30878: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.30881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.31006: 
Set connection var ansible_shell_executable to /bin/sh 41445 1727204226.31012: Set connection var ansible_shell_type to sh 41445 1727204226.31016: Set connection var ansible_pipelining to False 41445 1727204226.31037: Set connection var ansible_timeout to 10 41445 1727204226.31040: Set connection var ansible_connection to ssh 41445 1727204226.31172: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204226.31177: variable 'ansible_shell_executable' from source: unknown 41445 1727204226.31180: variable 'ansible_connection' from source: unknown 41445 1727204226.31183: variable 'ansible_module_compression' from source: unknown 41445 1727204226.31186: variable 'ansible_shell_type' from source: unknown 41445 1727204226.31188: variable 'ansible_shell_executable' from source: unknown 41445 1727204226.31190: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.31437: variable 'ansible_pipelining' from source: unknown 41445 1727204226.31440: variable 'ansible_timeout' from source: unknown 41445 1727204226.31442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.31445: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204226.31448: variable 'omit' from source: magic vars 41445 1727204226.31449: starting attempt loop 41445 1727204226.31451: running the handler 41445 1727204226.31454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204226.31456: 
_low_level_execute_command(): starting 41445 1727204226.31458: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204226.32777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.32828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.32842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.32891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.32957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.34605: stdout chunk (state=3): >>>/root <<< 41445 1727204226.34738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.34742: stdout chunk (state=3): >>><<< 41445 1727204226.34752: stderr chunk (state=3): >>><<< 41445 1727204226.34775: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.34884: _low_level_execute_command(): starting 41445 1727204226.34888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891 `" && echo ansible-tmp-1727204226.3477745-44399-203085241614891="` echo /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891 `" ) && sleep 0' 41445 1727204226.35390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.35400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204226.35426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204226.35430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204226.35441: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204226.35444: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204226.35481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.35485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204226.35487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204226.35490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204226.35492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204226.35494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204226.35644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204226.35648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204226.35651: stderr chunk (state=3): >>>debug2: match found <<< 41445 1727204226.35653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.35655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.35658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.35774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.35917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.37685: stdout chunk (state=3): >>>ansible-tmp-1727204226.3477745-44399-203085241614891=/root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891 <<< 41445 1727204226.37983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 41445 1727204226.37986: stdout chunk (state=3): >>><<< 41445 1727204226.37988: stderr chunk (state=3): >>><<< 41445 1727204226.37991: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204226.3477745-44399-203085241614891=/root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.37993: variable 'ansible_module_compression' from source: unknown 41445 1727204226.37995: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204226.37997: variable 'ansible_facts' from source: unknown 41445 1727204226.38061: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py 41445 1727204226.38271: Sending initial data 41445 
1727204226.38274: Sent initial data (156 bytes) 41445 1727204226.38884: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.38887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204226.38889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204226.38897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.38912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.38921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.39093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.39147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.40829: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41445 1727204226.40833: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 41445 1727204226.40836: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 41445 1727204226.40838: stderr chunk (state=3): >>>debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 <<< 41445 1727204226.40841: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 41445 1727204226.40843: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 41445 1727204226.40845: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 41445 1727204226.40847: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 41445 1727204226.40902: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41445 1727204226.40907: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 41445 1727204226.40909: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" <<< 41445 1727204226.40914: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204226.40951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204226.41002: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpkhbg0283 /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py <<< 41445 1727204226.41045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py" <<< 41445 1727204226.41070: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 41445 1727204226.41094: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmpkhbg0283" to remote "/root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py" <<< 41445 1727204226.41819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.41889: stderr chunk (state=3): >>><<< 41445 1727204226.41892: stdout chunk (state=3): >>><<< 41445 1727204226.41894: done transferring module to remote 41445 1727204226.41901: _low_level_execute_command(): starting 41445 1727204226.41907: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/ /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py && sleep 0' 41445 1727204226.42458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.42468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204226.42486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204226.42497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204226.42507: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 <<< 41445 1727204226.42516: stderr chunk (state=3): >>>debug2: match not found <<< 41445 1727204226.42524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.42593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204226.42600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address <<< 41445 1727204226.42603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41445 1727204226.42605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204226.42613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204226.42616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.42669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.42807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.42928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.44520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.44550: stderr chunk (state=3): >>><<< 41445 1727204226.44554: stdout chunk (state=3): >>><<< 41445 1727204226.44573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.44579: _low_level_execute_command(): starting 41445 1727204226.44582: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/AnsiballZ_command.py && sleep 0' 41445 1727204226.45012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204226.45016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.45018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration <<< 41445 1727204226.45021: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204226.45023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.45062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.45066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.45109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.60831: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:83:38:1a:ae:4d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.22/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2826sec preferred_lft 2826sec\n inet6 fe80::83:38ff:fe1a:ae4d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n39: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 26:cf:9a:9b:f7:ee brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.22 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.22 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 
ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:57:06.598656", "end": "2024-09-24 14:57:06.606857", "delta": "0:00:00.008201", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204226.62153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204226.62182: stderr chunk (state=3): >>><<< 41445 1727204226.62188: stdout chunk (state=3): >>><<< 41445 1727204226.62214: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:83:38:1a:ae:4d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.22/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2826sec preferred_lft 2826sec\n inet6 fe80::83:38ff:fe1a:ae4d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n39: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 26:cf:9a:9b:f7:ee brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.22 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.22 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:57:06.598656", "end": "2024-09-24 14:57:06.606857", "delta": "0:00:00.008201", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip 
a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
41445 1727204226.62269: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204226.62272: _low_level_execute_command(): starting 41445 1727204226.62280: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204226.3477745-44399-203085241614891/ > /dev/null 2>&1 && sleep 0' 41445 1727204226.62988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.63012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.63035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.63100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.64836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.64858: stderr chunk (state=3): >>><<< 41445 1727204226.64861: stdout chunk (state=3): >>><<< 41445 1727204226.64881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.64884: handler run complete 41445 1727204226.64905: Evaluated conditional (False): False 41445 1727204226.64915: attempt loop complete, returning result 41445 1727204226.64923: _execute() done 41445 1727204226.64925: dumping result to json 41445 1727204226.64927: done dumping result, returning 41445 1727204226.64933: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [028d2410-947f-bf02-eee4-0000000008e7] 41445 1727204226.64938: sending task result for task 028d2410-947f-bf02-eee4-0000000008e7 41445 1727204226.65042: done sending task result for task 028d2410-947f-bf02-eee4-0000000008e7 41445 1727204226.65044: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008201", "end": "2024-09-24 14:57:06.606857", "rc": 0, "start": "2024-09-24 14:57:06.598656" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:83:38:1a:ae:4d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.47.22/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2826sec preferred_lft 2826sec inet6 fe80::83:38ff:fe1a:ae4d/64 scope link noprefixroute valid_lft forever preferred_lft forever 39: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 26:cf:9a:9b:f7:ee brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft 
forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.22 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.22 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 41445 1727204226.65113: no more pending results, returning what we have 41445 1727204226.65117: results queue empty 41445 1727204226.65118: checking for any_errors_fatal 41445 1727204226.65119: done checking for any_errors_fatal 41445 1727204226.65120: checking for max_fail_percentage 41445 1727204226.65121: done checking for max_fail_percentage 41445 1727204226.65122: checking to see if all hosts have failed and the running result is not ok 41445 1727204226.65123: done checking to see if all hosts have failed 41445 1727204226.65124: getting the remaining hosts for this loop 41445 1727204226.65125: done getting the remaining hosts for this loop 41445 1727204226.65129: getting the next task for host managed-node3 41445 1727204226.65136: done getting next task for host managed-node3 41445 1727204226.65138: ^ task is: TASK: Verify DNS and network connectivity 41445 1727204226.65141: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204226.65144: getting variables 41445 1727204226.65146: in VariableManager get_vars() 41445 1727204226.65175: Calling all_inventory to load vars for managed-node3 41445 1727204226.65187: Calling groups_inventory to load vars for managed-node3 41445 1727204226.65191: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204226.65202: Calling all_plugins_play to load vars for managed-node3 41445 1727204226.65205: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204226.65207: Calling groups_plugins_play to load vars for managed-node3 41445 1727204226.66652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204226.68369: done with get_vars() 41445 1727204226.68396: done getting variables 41445 1727204226.68460: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:57:06 -0400 (0:00:00.393) 0:00:45.472 ***** 41445 1727204226.68493: entering _queue_task() for managed-node3/shell 41445 1727204226.69289: worker is 1 (out of 1 available) 41445 1727204226.69301: exiting _queue_task() for managed-node3/shell 41445 1727204226.69313: done queuing things up, now waiting for results queue to drain 41445 1727204226.69314: waiting for pending results... 
41445 1727204226.69725: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 41445 1727204226.69730: in run() - task 028d2410-947f-bf02-eee4-0000000008e8 41445 1727204226.69902: variable 'ansible_search_path' from source: unknown 41445 1727204226.69955: variable 'ansible_search_path' from source: unknown 41445 1727204226.69960: calling self._execute() 41445 1727204226.70068: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.70072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.70084: variable 'omit' from source: magic vars 41445 1727204226.70990: variable 'ansible_distribution_major_version' from source: facts 41445 1727204226.71003: Evaluated conditional (ansible_distribution_major_version != '6'): True 41445 1727204226.71165: variable 'ansible_facts' from source: unknown 41445 1727204226.73128: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 41445 1727204226.73135: variable 'omit' from source: magic vars 41445 1727204226.73217: variable 'omit' from source: magic vars 41445 1727204226.73256: variable 'omit' from source: magic vars 41445 1727204226.73373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41445 1727204226.73421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41445 1727204226.73445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41445 1727204226.73470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204226.73551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41445 1727204226.73642: variable 'inventory_hostname' from source: host vars for 'managed-node3' 41445 1727204226.73645: variable 
'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.73648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.73758: Set connection var ansible_shell_executable to /bin/sh 41445 1727204226.73761: Set connection var ansible_shell_type to sh 41445 1727204226.73766: Set connection var ansible_pipelining to False 41445 1727204226.73774: Set connection var ansible_timeout to 10 41445 1727204226.73778: Set connection var ansible_connection to ssh 41445 1727204226.73786: Set connection var ansible_module_compression to ZIP_DEFLATED 41445 1727204226.73816: variable 'ansible_shell_executable' from source: unknown 41445 1727204226.73820: variable 'ansible_connection' from source: unknown 41445 1727204226.73828: variable 'ansible_module_compression' from source: unknown 41445 1727204226.73831: variable 'ansible_shell_type' from source: unknown 41445 1727204226.73833: variable 'ansible_shell_executable' from source: unknown 41445 1727204226.73837: variable 'ansible_host' from source: host vars for 'managed-node3' 41445 1727204226.73842: variable 'ansible_pipelining' from source: unknown 41445 1727204226.73845: variable 'ansible_timeout' from source: unknown 41445 1727204226.73917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 41445 1727204226.74161: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204226.74171: variable 'omit' from source: magic vars 41445 1727204226.74177: starting attempt loop 41445 1727204226.74180: running the handler 41445 1727204226.74190: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 41445 1727204226.74209: _low_level_execute_command(): starting 41445 1727204226.74218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41445 1727204226.75059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.75140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.75173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.75207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.76788: stdout chunk (state=3): >>>/root <<< 41445 1727204226.76972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.76977: stdout chunk (state=3): >>><<< 41445 1727204226.76980: stderr chunk (state=3): >>><<< 41445 1727204226.77133: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.77137: _low_level_execute_command(): starting 41445 1727204226.77140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289 `" && echo ansible-tmp-1727204226.770115-44417-8222253771289="` echo /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289 `" ) && sleep 0' 41445 1727204226.77827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 
10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.77844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41445 1727204226.77890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.77934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.77961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.78093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.78106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.79934: stdout chunk (state=3): >>>ansible-tmp-1727204226.770115-44417-8222253771289=/root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289 <<< 41445 1727204226.80082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.80091: stdout chunk (state=3): >>><<< 41445 1727204226.80105: stderr chunk (state=3): >>><<< 41445 1727204226.80194: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204226.770115-44417-8222253771289=/root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.80197: variable 'ansible_module_compression' from source: unknown 41445 1727204226.80343: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-414450s0ylvj1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41445 1727204226.80421: variable 'ansible_facts' from source: unknown 41445 1727204226.80507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py 41445 1727204226.80752: Sending initial data 41445 1727204226.80755: Sent initial data (153 bytes) 41445 1727204226.81399: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.81429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41445 1727204226.81506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.81597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.81616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.81668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.81717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.83364: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41445 1727204226.83409: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41445 1727204226.83511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp0ovx0wpg /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py <<< 41445 1727204226.83515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py" <<< 41445 1727204226.83567: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-414450s0ylvj1/tmp0ovx0wpg" to remote "/root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py" <<< 41445 1727204226.84559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.84635: stderr chunk (state=3): >>><<< 41445 1727204226.84670: stdout chunk (state=3): >>><<< 41445 1727204226.84753: done transferring module to remote 41445 1727204226.84756: _low_level_execute_command(): starting 41445 1727204226.84759: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/ /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py && sleep 0' 41445 1727204226.85485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.85561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.85701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.85779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.85814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204226.85962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204226.87792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204226.87796: stdout chunk (state=3): >>><<< 41445 1727204226.87799: stderr chunk (state=3): >>><<< 41445 1727204226.87889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204226.87894: _low_level_execute_command(): starting 41445 1727204226.87897: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/AnsiballZ_command.py && sleep 0' 41445 1727204226.89133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41445 1727204226.89251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41445 1727204226.89272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204226.89287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' <<< 41445 1727204226.89304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204226.89327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 
1727204226.89406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204227.37914: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1465 0 --:--:-- --:--:-- --:--:-- 1466\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2769 0 --:--:-- --:--:-- --:--:-- 2771", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:57:07.041323", "end": "2024-09-24 14:57:07.377885", "delta": "0:00:00.336562", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41445 1727204227.39510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 
<<< 41445 1727204227.39540: stderr chunk (state=3): >>><<< 41445 1727204227.39543: stdout chunk (state=3): >>><<< 41445 1727204227.39568: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1465 0 --:--:-- --:--:-- --:--:-- 1466\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2769 0 --:--:-- --:--:-- --:--:-- 2771", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:57:07.041323", "end": "2024-09-24 14:57:07.377885", "delta": "0:00:00.336562", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.22 closed. 41445 1727204227.39606: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41445 1727204227.39616: _low_level_execute_command(): starting 41445 1727204227.39619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204226.770115-44417-8222253771289/ > /dev/null 2>&1 && sleep 0' 41445 1727204227.40145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41445 1727204227.40148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41445 1727204227.40228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK <<< 41445 1727204227.40231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41445 1727204227.40287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41445 1727204227.42117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41445 1727204227.42157: stderr chunk (state=3): >>><<< 41445 1727204227.42161: stdout chunk (state=3): >>><<< 41445 1727204227.42173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.22 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.22 originally 10.31.47.22 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41445 1727204227.42184: handler run complete 41445 1727204227.42203: Evaluated conditional (False): False 41445 1727204227.42209: attempt loop complete, returning result 41445 1727204227.42211: _execute() done 41445 1727204227.42217: dumping result to json 41445 1727204227.42223: done dumping result, returning 41445 1727204227.42231: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [028d2410-947f-bf02-eee4-0000000008e8] 41445 1727204227.42235: sending task result for task 028d2410-947f-bf02-eee4-0000000008e8 41445 1727204227.42334: done sending task result for task 028d2410-947f-bf02-eee4-0000000008e8 41445 1727204227.42337: WORKER PROCESS EXITING
ok: [managed-node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.336562",
    "end": "2024-09-24 14:57:07.377885",
    "rc": 0,
    "start": "2024-09-24 14:57:07.041323"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   1465      0 --:--:-- --:--:-- --:--:--  1466
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   2769      0 --:--:-- --:--:-- --:--:--  2771

41445 1727204227.42406: no more pending results, returning what we have 41445 1727204227.42410: results queue empty 41445 1727204227.42410:
checking for any_errors_fatal 41445 1727204227.42420: done checking for any_errors_fatal 41445 1727204227.42421: checking for max_fail_percentage 41445 1727204227.42423: done checking for max_fail_percentage 41445 1727204227.42424: checking to see if all hosts have failed and the running result is not ok 41445 1727204227.42425: done checking to see if all hosts have failed 41445 1727204227.42425: getting the remaining hosts for this loop 41445 1727204227.42427: done getting the remaining hosts for this loop 41445 1727204227.42430: getting the next task for host managed-node3 41445 1727204227.42438: done getting next task for host managed-node3 41445 1727204227.42440: ^ task is: TASK: meta (flush_handlers) 41445 1727204227.42446: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204227.42449: getting variables 41445 1727204227.42451: in VariableManager get_vars() 41445 1727204227.42482: Calling all_inventory to load vars for managed-node3 41445 1727204227.42487: Calling groups_inventory to load vars for managed-node3 41445 1727204227.42491: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204227.42501: Calling all_plugins_play to load vars for managed-node3 41445 1727204227.42505: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204227.42508: Calling groups_plugins_play to load vars for managed-node3 41445 1727204227.43468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204227.44368: done with get_vars() 41445 1727204227.44389: done getting variables 41445 1727204227.44443: in VariableManager get_vars() 41445 1727204227.44452: Calling all_inventory to load vars for managed-node3 41445 1727204227.44454: Calling groups_inventory to load vars for managed-node3 41445 1727204227.44456: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204227.44459: Calling all_plugins_play to load vars for managed-node3 41445 1727204227.44461: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204227.44462: Calling groups_plugins_play to load vars for managed-node3 41445 1727204227.45335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204227.46632: done with get_vars() 41445 1727204227.46651: done queuing things up, now waiting for results queue to drain 41445 1727204227.46653: results queue empty 41445 1727204227.46654: checking for any_errors_fatal 41445 1727204227.46656: done checking for any_errors_fatal 41445 1727204227.46656: checking for max_fail_percentage 41445 1727204227.46657: done checking for max_fail_percentage 41445 1727204227.46658: checking to see if all hosts have failed and the running result is not 
ok 41445 1727204227.46658: done checking to see if all hosts have failed 41445 1727204227.46659: getting the remaining hosts for this loop 41445 1727204227.46659: done getting the remaining hosts for this loop 41445 1727204227.46662: getting the next task for host managed-node3 41445 1727204227.46664: done getting next task for host managed-node3 41445 1727204227.46665: ^ task is: TASK: meta (flush_handlers) 41445 1727204227.46666: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41445 1727204227.46668: getting variables 41445 1727204227.46669: in VariableManager get_vars() 41445 1727204227.46674: Calling all_inventory to load vars for managed-node3 41445 1727204227.46677: Calling groups_inventory to load vars for managed-node3 41445 1727204227.46679: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204227.46683: Calling all_plugins_play to load vars for managed-node3 41445 1727204227.46685: Calling groups_plugins_inventory to load vars for managed-node3 41445 1727204227.46686: Calling groups_plugins_play to load vars for managed-node3 41445 1727204227.47358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204227.48262: done with get_vars() 41445 1727204227.48282: done getting variables 41445 1727204227.48322: in VariableManager get_vars() 41445 1727204227.48329: Calling all_inventory to load vars for managed-node3 41445 1727204227.48331: Calling groups_inventory to load vars for managed-node3 41445 1727204227.48332: Calling all_plugins_inventory to load vars for managed-node3 41445 1727204227.48336: Calling all_plugins_play to load vars for managed-node3 41445 1727204227.48337: Calling groups_plugins_inventory to load vars for 
managed-node3 41445 1727204227.48340: Calling groups_plugins_play to load vars for managed-node3 41445 1727204227.49079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41445 1727204227.49985: done with get_vars() 41445 1727204227.50004: done queuing things up, now waiting for results queue to drain 41445 1727204227.50006: results queue empty 41445 1727204227.50007: checking for any_errors_fatal 41445 1727204227.50007: done checking for any_errors_fatal 41445 1727204227.50008: checking for max_fail_percentage 41445 1727204227.50009: done checking for max_fail_percentage 41445 1727204227.50009: checking to see if all hosts have failed and the running result is not ok 41445 1727204227.50009: done checking to see if all hosts have failed 41445 1727204227.50010: getting the remaining hosts for this loop 41445 1727204227.50013: done getting the remaining hosts for this loop 41445 1727204227.50016: getting the next task for host managed-node3 41445 1727204227.50018: done getting next task for host managed-node3 41445 1727204227.50018: ^ task is: None 41445 1727204227.50019: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41445 1727204227.50020: done queuing things up, now waiting for results queue to drain 41445 1727204227.50021: results queue empty 41445 1727204227.50021: checking for any_errors_fatal 41445 1727204227.50022: done checking for any_errors_fatal 41445 1727204227.50022: checking for max_fail_percentage 41445 1727204227.50023: done checking for max_fail_percentage 41445 1727204227.50023: checking to see if all hosts have failed and the running result is not ok 41445 1727204227.50024: done checking to see if all hosts have failed 41445 1727204227.50024: getting the next task for host managed-node3 41445 1727204227.50026: done getting next task for host managed-node3 41445 1727204227.50026: ^ task is: None 41445 1727204227.50027: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed-node3              : ok=90   changed=6    unreachable=0    failed=0    skipped=91   rescued=0    ignored=1

Tuesday 24 September 2024  14:57:07 -0400 (0:00:00.815)       0:00:46.288 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.77s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.74s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.39s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_table_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.24s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:149
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.19s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.17s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_table.yml:3
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Create veth interface ethtest0 ------------------------------------------ 1.07s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.91s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.86s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.84s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Verify DNS and network connectivity ------------------------------------- 0.82s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Install iproute --------------------------------------------------------- 0.81s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.78s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather the minimum subset of ansible_facts required by the network role test --- 0.76s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
41445 1727204227.50122: RUNNING CLEANUP
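The repeated `auto-mux: Trying existing master at '/root/.ansible/cp/2833a247f6'` lines throughout the log show OpenSSH connection multiplexing: every `_low_level_execute_command()` and SFTP transfer reuses one persistent master connection per host instead of opening a new SSH session. A minimal sketch of how this behavior is typically configured in `ansible.cfg` (the specific values below are illustrative defaults, not settings read from this run):

```ini
[ssh_connection]
; ControlMaster/ControlPersist produce the mux master seen in the log;
; control_path_dir is where sockets like /root/.ansible/cp/2833a247f6 live.
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
control_path_dir = ~/.ansible/cp
```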
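The "Verify DNS and network connectivity" task above ships its check as a JSON-escaped `_raw_params` string, which is hard to read in the log. The following is a sketch of that same check as a standalone script; the function wrapper and the host-list argument are additions for readability and testability (the logged task ran the loop inline under `set -euo pipefail` with the two mirror hosts hard-coded):

```shell
# Reconstruction of the logged connectivity check. The function wrapper is
# an addition -- the original ran inline with hosts hard-coded.
check_dns_and_connectivity() {
    echo CHECK DNS AND CONNECTIVITY
    for host in "$@"; do
        # DNS lookup via the local resolver; getent also prints the
        # resolved records, which is what produced the STDOUT block above.
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            return 1
        fi
        # HTTPS reachability check; the body is discarded, only the
        # exit code matters (curl's progress meter went to STDERR).
        if ! curl -o /dev/null "https://$host"; then
            echo FAILED to contact host "$host"
            return 1
        fi
    done
}
```

Calling `check_dns_and_connectivity mirrors.fedoraproject.org mirrors.centos.org` reproduces the task's behavior: any lookup or connection failure prints a FAILED line and returns non-zero, which the `command` module reports as a failed task.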