[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
18699 1726882325.66301: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
18699 1726882325.66773: Added group all to inventory
18699 1726882325.66775: Added group ungrouped to inventory
18699 1726882325.66779: Group all now contains ungrouped
18699 1726882325.66782: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
18699 1726882325.86536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
18699 1726882325.86597: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
18699 1726882325.86735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
18699 1726882325.86897: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
18699 1726882325.86978: Loaded config def from plugin (inventory/script)
18699 1726882325.86980: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
18699 1726882325.87023: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
18699 1726882325.87294: Loaded config def from plugin (inventory/yaml)
18699 1726882325.87297: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
18699 1726882325.87499: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
18699 1726882325.88374: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
18699 1726882325.88377: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
18699 1726882325.88380: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
18699 1726882325.88386: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
18699 1726882325.88390: Loading data from /tmp/network-Kc3/inventory.yml
18699 1726882325.88525: /tmp/network-Kc3/inventory.yml was not parsable by auto
18699 1726882325.88710: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
18699 1726882325.88749: Loading data from /tmp/network-Kc3/inventory.yml
18699 1726882325.88951: group all already in inventory
18699 1726882325.88958: set inventory_file for managed_node1
18699 1726882325.88962: set inventory_dir for managed_node1
18699 1726882325.88963: Added host managed_node1 to inventory
18699 1726882325.88965: Added host managed_node1 to group all
18699 1726882325.88966: set ansible_host for managed_node1
18699 1726882325.88967: set ansible_ssh_extra_args for managed_node1
18699 1726882325.88970: set inventory_file for managed_node2
18699 1726882325.88972: set inventory_dir for managed_node2
18699 1726882325.88973: Added host managed_node2 to inventory
18699 1726882325.88974: Added host managed_node2 to group all
18699 1726882325.88975: set ansible_host for managed_node2
18699 1726882325.88976: set ansible_ssh_extra_args for managed_node2
18699 1726882325.88978: set inventory_file for managed_node3
18699 1726882325.88980: set inventory_dir for managed_node3
18699 1726882325.88981: Added host managed_node3 to inventory
18699 1726882325.88982: Added host managed_node3 to group all
18699 1726882325.88983: set ansible_host for managed_node3
18699 1726882325.88984: set ansible_ssh_extra_args for managed_node3
18699 1726882325.88986: Reconcile groups and hosts in inventory.
18699 1726882325.88990: Group ungrouped now contains managed_node1
18699 1726882325.88991: Group ungrouped now contains managed_node2
18699 1726882325.88998: Group ungrouped now contains managed_node3
18699 1726882325.89187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
18699 1726882325.89410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
18699 1726882325.89465: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
18699 1726882325.89492: Loaded config def from plugin (vars/host_group_vars)
18699 1726882325.89496: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
18699 1726882325.89503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
18699 1726882325.89511: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
18699 1726882325.89561: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
18699 1726882325.89918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882325.90020: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
18699 1726882325.90061: Loaded config def from plugin (connection/local)
18699 1726882325.90065: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
18699 1726882325.90803: Loaded config def from plugin (connection/paramiko_ssh)
18699 1726882325.90807: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
18699 1726882325.92125: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18699 1726882325.92233: Loaded config def from plugin (connection/psrp)
18699 1726882325.92349: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
18699 1726882325.93990: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18699 1726882325.94032: Loaded config def from plugin (connection/ssh)
18699 1726882325.94035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
18699 1726882325.98361: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
18699 1726882325.98455: Loaded config def from plugin (connection/winrm)
18699 1726882325.98458: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
18699 1726882325.98490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
18699 1726882325.98556: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
18699 1726882325.98839: Loaded config def from plugin (shell/cmd)
18699 1726882325.98841: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
18699 1726882325.98908: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
18699 1726882325.98978: Loaded config def from plugin (shell/powershell)
18699 1726882325.98981: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
18699 1726882325.99149: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
18699 1726882325.99390: Loaded config def from plugin (shell/sh)
18699 1726882325.99399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
18699 1726882325.99433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
18699 1726882325.99558: Loaded config def from plugin (become/runas)
18699 1726882325.99560: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
18699 1726882325.99795: Loaded config def from plugin (become/su)
18699 1726882325.99797: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
18699 1726882325.99964: Loaded config def from plugin (become/sudo)
18699 1726882325.99967: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
18699 1726882326.00001: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18699 1726882326.00432: in VariableManager get_vars()
18699 1726882326.00467: done with get_vars()
18699 1726882326.00609: trying /usr/local/lib/python3.12/site-packages/ansible/modules
18699 1726882326.08207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
18699 1726882326.08729: in VariableManager get_vars()
18699 1726882326.08734: done with get_vars()
18699 1726882326.08737: variable 'playbook_dir' from source: magic vars
18699 1726882326.08738: variable 'ansible_playbook_python' from source: magic vars
18699 1726882326.08739: variable 'ansible_config_file' from source: magic vars
18699 1726882326.08739: variable 'groups' from source: magic vars
18699 1726882326.08740: variable 'omit' from source: magic vars
18699 1726882326.08741: variable 'ansible_version' from source: magic vars
18699 1726882326.08741: variable 'ansible_check_mode' from source: magic vars
18699 1726882326.08742: variable 'ansible_diff_mode' from source: magic vars
18699 1726882326.08743: variable 'ansible_forks' from source: magic vars
18699 1726882326.08744: variable 'ansible_inventory_sources' from source: magic vars
18699 1726882326.08744: variable 'ansible_skip_tags' from source: magic vars
18699 1726882326.08745: variable 'ansible_limit' from source: magic vars
18699 1726882326.08746: variable 'ansible_run_tags' from source: magic vars
18699 1726882326.08746: variable 'ansible_verbosity' from source: magic vars
18699 1726882326.08781: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml
18699 1726882326.10670: in VariableManager get_vars()
18699 1726882326.10688: done with get_vars()
18699 1726882326.11030: in VariableManager get_vars()
18699 1726882326.11054: done with get_vars()
18699 1726882326.11095: in VariableManager get_vars()
18699 1726882326.11108: done with get_vars()
18699 1726882326.11162: in VariableManager get_vars()
18699 1726882326.11182: done with get_vars()
18699 1726882326.11380: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18699 1726882326.11820: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18699 1726882326.12122: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18699 1726882326.13548: in VariableManager get_vars()
18699 1726882326.13569: done with get_vars()
18699 1726882326.14502: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
18699 1726882326.14878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18699 1726882326.17578: in VariableManager get_vars()
18699 1726882326.17602: done with get_vars()
18699 1726882326.17816: in VariableManager get_vars()
18699 1726882326.17821: done with get_vars()
18699 1726882326.17823: variable 'playbook_dir' from source: magic vars
18699 1726882326.17824: variable 'ansible_playbook_python' from source: magic vars
18699 1726882326.17825: variable 'ansible_config_file' from source: magic vars
18699 1726882326.17826: variable 'groups' from source: magic vars
18699 1726882326.17826: variable 'omit' from source: magic vars
18699 1726882326.17827: variable 'ansible_version' from source: magic vars
18699 1726882326.17828: variable 'ansible_check_mode' from source: magic vars
18699 1726882326.17829: variable 'ansible_diff_mode' from source: magic vars
18699 1726882326.17829: variable 'ansible_forks' from source: magic vars
18699 1726882326.17830: variable 'ansible_inventory_sources' from source: magic vars
18699 1726882326.17831: variable 'ansible_skip_tags' from source: magic vars
18699 1726882326.17832: variable 'ansible_limit' from source: magic vars
18699 1726882326.17832: variable 'ansible_run_tags' from source: magic vars
18699 1726882326.17833: variable 'ansible_verbosity' from source: magic vars
18699 1726882326.17979: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
18699 1726882326.18053: in VariableManager get_vars()
18699 1726882326.18171: done with get_vars()
18699 1726882326.18174: variable 'playbook_dir' from source: magic vars
18699 1726882326.18175: variable 'ansible_playbook_python' from source: magic vars
18699 1726882326.18176: variable 'ansible_config_file' from source: magic vars
18699 1726882326.18176: variable 'groups' from source: magic vars
18699 1726882326.18177: variable 'omit' from source: magic vars
18699 1726882326.18178: variable 'ansible_version' from source: magic vars
18699 1726882326.18179: variable 'ansible_check_mode' from source: magic vars
18699 1726882326.18179: variable 'ansible_diff_mode' from source: magic vars
18699 1726882326.18180: variable 'ansible_forks' from source: magic vars
18699 1726882326.18181: variable 'ansible_inventory_sources' from source: magic vars
18699 1726882326.18181: variable 'ansible_skip_tags' from source: magic vars
18699 1726882326.18182: variable 'ansible_limit' from source: magic vars
18699 1726882326.18183: variable 'ansible_run_tags' from source: magic vars
18699 1726882326.18183: variable 'ansible_verbosity' from source: magic vars
18699 1726882326.18215: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
18699 1726882326.18412: in VariableManager get_vars()
18699 1726882326.18424: done with get_vars()
18699 1726882326.18463: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18699 1726882326.18681: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18699 1726882326.18883: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18699 1726882326.19958: in VariableManager get_vars()
18699 1726882326.20022: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18699 1726882326.24660: in VariableManager get_vars()
18699 1726882326.24684: done with get_vars()
18699 1726882326.24729: in VariableManager get_vars()
18699 1726882326.24733: done with get_vars()
18699 1726882326.24735: variable 'playbook_dir' from source: magic vars
18699 1726882326.24736: variable 'ansible_playbook_python' from source: magic vars
18699 1726882326.24737: variable 'ansible_config_file' from source: magic vars
18699 1726882326.24738: variable 'groups' from source: magic vars
18699 1726882326.24739: variable 'omit' from source: magic vars
18699 1726882326.24739: variable 'ansible_version' from source: magic vars
18699 1726882326.24740: variable 'ansible_check_mode' from source: magic vars
18699 1726882326.24741: variable 'ansible_diff_mode' from source: magic vars
18699 1726882326.24742: variable 'ansible_forks' from source: magic vars
18699 1726882326.24742: variable 'ansible_inventory_sources' from source: magic vars
18699 1726882326.24743: variable 'ansible_skip_tags' from source: magic vars
18699 1726882326.24744: variable 'ansible_limit' from source: magic vars
18699 1726882326.24745: variable 'ansible_run_tags' from source: magic vars
18699 1726882326.24746: variable 'ansible_verbosity' from source: magic vars
18699 1726882326.24781: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
18699 1726882326.25254: in VariableManager get_vars()
18699 1726882326.25266: done with get_vars()
18699 1726882326.25306: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
18699 1726882326.29683: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
18699 1726882326.30050: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
18699 1726882326.31135: in VariableManager get_vars()
18699 1726882326.31157: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18699 1726882326.34722: in VariableManager get_vars()
18699 1726882326.34740: done with get_vars()
18699 1726882326.34830: in VariableManager get_vars()
18699 1726882326.34844: done with get_vars()
18699 1726882326.35023: in VariableManager get_vars()
18699 1726882326.35036: done with get_vars()
18699 1726882326.35251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
18699 1726882326.35266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
18699 1726882326.35862: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
18699 1726882326.36192: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
18699 1726882326.36196: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
18699 1726882326.36229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
18699 1726882326.36255: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
18699 1726882326.36951: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
18699 1726882326.37005: Loaded config def from plugin (callback/default)
18699 1726882326.37008: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18699 1726882326.39781: Loaded config def from plugin (callback/junit)
18699 1726882326.39785: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18699 1726882326.39953: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
18699 1726882326.40122: Loaded config def from plugin (callback/minimal)
18699 1726882326.40125: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18699 1726882326.40171: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
18699 1726882326.40232: Loaded config def from plugin (callback/tree)
18699 1726882326.40234: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
18699 1726882326.40538: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
18699 1726882326.40541: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
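The trace does not show the command line that produced it, so the following is a hedged reconstruction: the `-vvvv` flag and the inventory/playbook paths are inferred from the log above, and the two environment variables are assumptions. `ANSIBLE_COLLECTIONS_PATH` (singular) is the replacement named in the deprecation warning at the top, and setting a stdout callback such as `ansible.posix.debug` is one way the 'default'/'minimal'/'oneline' callbacks end up skipped as shown.

```shell
# Assumed environment for a run like the one traced above.
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-spT   # singular form; ANSIBLE_COLLECTIONS_PATHS is deprecated
export ANSIBLE_STDOUT_CALLBACK=ansible.posix.debug     # assumed; would explain the "Skipping callback" lines

# Hypothetical invocation (paths taken from the log, flags are a guess):
echo "ansible-playbook -vvvv -i /tmp/network-Kc3/inventory.yml \
  /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml"
```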

PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18699 1726882326.41004: in VariableManager get_vars()
18699 1726882326.41020: done with get_vars()
18699 1726882326.41026: in VariableManager get_vars()
18699 1726882326.41034: done with get_vars()
18699 1726882326.41039: variable 'omit' from source: magic vars
18699 1726882326.41079: in VariableManager get_vars()
18699 1726882326.41092: done with get_vars()
18699 1726882326.41318: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
18699 1726882326.42253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
18699 1726882326.42531: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
18699 1726882326.42565: getting the remaining hosts for this loop
18699 1726882326.42567: done getting the remaining hosts for this loop
18699 1726882326.42570: getting the next task for host managed_node1
18699 1726882326.42574: done getting next task for host managed_node1
18699 1726882326.42576: ^ task is: TASK: Gathering Facts
18699 1726882326.42577: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882326.42585: getting variables
18699 1726882326.42586: in VariableManager get_vars()
18699 1726882326.42799: Calling all_inventory to load vars for managed_node1
18699 1726882326.42802: Calling groups_inventory to load vars for managed_node1
18699 1726882326.42805: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882326.42817: Calling all_plugins_play to load vars for managed_node1
18699 1726882326.42828: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882326.42832: Calling groups_plugins_play to load vars for managed_node1
18699 1726882326.42865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882326.42921: done with get_vars()
18699 1726882326.42928: done getting variables
18699 1726882326.43194: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 21:32:06 -0400 (0:00:00.028) 0:00:00.028 ******
18699 1726882326.43217: entering _queue_task() for managed_node1/gather_facts
18699 1726882326.43218: Creating lock for gather_facts
18699 1726882326.43780: worker is 1 (out of 1 available)
18699 1726882326.43791: exiting _queue_task() for managed_node1/gather_facts
18699 1726882326.44006: done queuing things up, now waiting for results queue to drain
18699 1726882326.44008: waiting for pending results...
18699 1726882326.44357: running TaskExecutor() for managed_node1/TASK: Gathering Facts
18699 1726882326.44474: in run() - task 12673a56-9f93-1ce6-d207-00000000007c
18699 1726882326.44478: variable 'ansible_search_path' from source: unknown
18699 1726882326.44784: calling self._execute()
18699 1726882326.44788: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882326.44790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882326.44794: variable 'omit' from source: magic vars
18699 1726882326.45190: variable 'omit' from source: magic vars
18699 1726882326.45326: variable 'omit' from source: magic vars
18699 1726882326.45474: variable 'omit' from source: magic vars
18699 1726882326.45679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18699 1726882326.45683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18699 1726882326.45717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18699 1726882326.45739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882326.45806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882326.45844: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18699 1726882326.45907: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882326.45911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882326.46126: Set connection var ansible_connection to ssh
18699 1726882326.46141: Set connection var ansible_pipelining to False
18699 1726882326.46152: Set connection var ansible_shell_executable to /bin/sh
18699 1726882326.46163: Set connection var ansible_timeout to 10
18699 1726882326.46170: Set connection var ansible_shell_type to sh
18699 1726882326.46209: Set connection var ansible_module_compression to ZIP_DEFLATED
18699 1726882326.46248: variable 'ansible_shell_executable' from source: unknown
18699 1726882326.46439: variable 'ansible_connection' from source: unknown
18699 1726882326.46442: variable 'ansible_module_compression' from source: unknown
18699 1726882326.46444: variable 'ansible_shell_type' from source: unknown
18699 1726882326.46447: variable 'ansible_shell_executable' from source: unknown
18699 1726882326.46451: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882326.46454: variable 'ansible_pipelining' from source: unknown
18699 1726882326.46457: variable 'ansible_timeout' from source: unknown
18699 1726882326.46460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882326.46701: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18699 1726882326.46769: variable 'omit' from source: magic vars
18699 1726882326.46780: starting attempt loop
18699 1726882326.46786: running the handler
18699 1726882326.46811: variable 'ansible_facts' from source: unknown
18699 1726882326.46899: _low_level_execute_command(): starting
18699 1726882326.46916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18699 1726882326.48425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882326.48656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882326.48670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882326.48711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882326.48792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882326.50526: stdout chunk (state=3): >>>/root <<<
18699 1726882326.51099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882326.51102: stdout chunk (state=3): >>><<<
18699 1726882326.51104: stderr chunk (state=3): >>><<<
18699 1726882326.51108: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18699 1726882326.51110: _low_level_execute_command(): starting
18699 1726882326.51113: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353 `" && echo ansible-tmp-1726882326.5104434-18736-32929865767353="` echo /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353 `" ) && sleep 0'
18699 1726882326.52531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882326.52551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882326.52698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882326.54605: stdout chunk (state=3): >>>ansible-tmp-1726882326.5104434-18736-32929865767353=/root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353 <<<
18699 1726882326.54836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882326.54840: stdout chunk (state=3): >>><<<
18699 1726882326.54842: stderr chunk (state=3): >>><<<
18699 1726882326.54845: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882326.5104434-18736-32929865767353=/root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0 18699 1726882326.54848: variable 'ansible_module_compression' from source: unknown 18699 1726882326.54974: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18699 1726882326.54978: ANSIBALLZ: Acquiring lock 18699 1726882326.54981: ANSIBALLZ: Lock acquired: 140254445799856 18699 1726882326.54983: ANSIBALLZ: Creating module 18699 1726882326.93975: ANSIBALLZ: Writing module into payload 18699 1726882326.94351: ANSIBALLZ: Writing module 18699 1726882326.94426: ANSIBALLZ: Renaming module 18699 1726882326.94453: ANSIBALLZ: Done creating module 18699 1726882326.94499: variable 'ansible_facts' from source: unknown 18699 1726882326.94564: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882326.94588: _low_level_execute_command(): starting 18699 1726882326.94604: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18699 1726882326.95998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882326.96037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882326.96097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882326.97665: stdout chunk (state=3): >>>PLATFORM <<< 18699 1726882326.97877: stdout chunk (state=3): >>>Linux <<< 18699 1726882326.97881: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18699 1726882326.98018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882326.98079: stderr chunk (state=3): >>><<< 18699 1726882326.98098: stdout chunk (state=3): >>><<< 18699 1726882326.98120: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882326.98136 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 18699 1726882326.98403: _low_level_execute_command(): starting 18699 1726882326.98406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 18699 1726882326.98548: Sending initial data 18699 1726882326.98610: Sent initial data (1181 bytes) 18699 1726882326.99628: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882326.99640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882326.99840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882326.99938: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882327.04121: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18699 1726882327.04529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882327.04705: stderr chunk (state=3): >>><<< 18699 1726882327.04709: stdout chunk (state=3): >>><<< 18699 1726882327.04713: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882327.04822: variable 'ansible_facts' from source: unknown 18699 1726882327.04832: variable 'ansible_facts' from source: unknown 18699 1726882327.04851: variable 'ansible_module_compression' from source: unknown 18699 1726882327.05300: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882327.05303: variable 'ansible_facts' from source: unknown 18699 1726882327.05306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py 18699 1726882327.05516: Sending initial data 18699 1726882327.05525: Sent initial data (153 bytes) 18699 1726882327.06045: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882327.06062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882327.06076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882327.06097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882327.06115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
<<< 18699 1726882327.06127: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882327.06140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882327.06157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882327.06168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882327.06209: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882327.06263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882327.06277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882327.06304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882327.06380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882327.08658: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 
1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882327.08702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882327.08951: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp6jf9h9sy /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py <<< 18699 1726882327.08955: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py" <<< 18699 1726882327.08991: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp6jf9h9sy" to remote "/root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py" <<< 18699 1726882327.12806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882327.12811: stdout chunk (state=3): >>><<< 18699 1726882327.12813: stderr chunk (state=3): >>><<< 18699 1726882327.12815: done transferring module to remote 18699 1726882327.12818: _low_level_execute_command(): starting 18699 1726882327.12820: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/ /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py && sleep 0' 18699 1726882327.14023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882327.14039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882327.14110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882327.14350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882327.14374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882327.14674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882327.16690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882327.16700: stdout chunk (state=3): >>><<< 18699 1726882327.16703: stderr chunk (state=3): >>><<< 18699 1726882327.16802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882327.16806: _low_level_execute_command(): starting 18699 1726882327.16809: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/AnsiballZ_setup.py && sleep 0' 18699 1726882327.17411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882327.17420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882327.17442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882327.17460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882327.17472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882327.17479: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882327.17489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882327.17577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882327.17580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882327.17583: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882327.17585: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 18699 1726882327.17587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882327.17589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882327.17591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882327.17597: stderr chunk (state=3): >>>debug2: match found <<< 18699 1726882327.17599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882327.17664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882327.17734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882327.20559: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 18699 1726882327.20646: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 18699 1726882327.20701: stdout chunk (state=3): >>>import 'posix' # <<< 18699 1726882327.20738: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 18699 1726882327.20773: stdout chunk (state=3): >>>import 'time' # <<< 18699 1726882327.20777: stdout chunk (state=3): >>>import 'zipimport' # <<< 18699 1726882327.20780: stdout chunk (state=3): >>># installed zipimport hook <<< 18699 1726882327.20864: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.20886: stdout chunk (state=3): >>>import '_codecs' # <<< 18699 
1726882327.20917: stdout chunk (state=3): >>>import 'codecs' # <<< 18699 1726882327.21042: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec591104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec590dfb30> <<< 18699 1726882327.21047: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 18699 1726882327.21071: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec59112a50> import '_signal' # <<< 18699 1726882327.21113: stdout chunk (state=3): >>>import '_abc' # <<< 18699 1726882327.21126: stdout chunk (state=3): >>>import 'abc' # <<< 18699 1726882327.21177: stdout chunk (state=3): >>>import 'io' # import '_stat' # <<< 18699 1726882327.21187: stdout chunk (state=3): >>>import 'stat' # <<< 18699 1726882327.21309: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18699 1726882327.21380: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # <<< 18699 1726882327.21404: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 18699 1726882327.21429: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 18699 1726882327.21450: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 18699 1726882327.21535: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ec1130> <<< 18699 1726882327.21598: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 18699 1726882327.21611: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ec1fa0> <<< 18699 1726882327.21651: stdout chunk (state=3): >>>import 'site' # <<< 18699 1726882327.21700: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 18699 1726882327.22329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18699 1726882327.22351: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 18699 1726882327.22387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18699 1726882327.22443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18699 1726882327.22462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18699 1726882327.22509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58effd70> <<< 18699 1726882327.22530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 18699 1726882327.22583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efffb0> <<< 18699 1726882327.22609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18699 1726882327.22669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18699 1726882327.22735: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.22759: stdout chunk (state=3): >>>import 'itertools' # <<< 18699 1726882327.22817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f37770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 18699 1726882327.22829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f37e00> <<< 18699 1726882327.22914: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f17a40> <<< 18699 1726882327.22925: stdout chunk (state=3): >>>import '_functools' # <<< 18699 1726882327.22962: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f15160> <<< 18699 1726882327.23098: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efcf20> <<< 18699 1726882327.23130: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 18699 1726882327.23153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18699 1726882327.23164: stdout chunk (state=3): >>>import '_sre' # <<< 18699 1726882327.23198: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 18699 1726882327.23248: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 18699 1726882327.23259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 18699 1726882327.23301: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f576b0> <<< 18699 1726882327.23349: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f562d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 18699 1726882327.23361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f16030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f54b60> <<< 18699 1726882327.26314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8c6b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efc1a0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58f8cb60> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8ca10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58f8ce00> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efacc0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8d4f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8d1c0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8e3f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa45f0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58fa5cd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa6b70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58fa71d0> impor<<< 18699 1726882327.26325: stdout chunk (state=3): >>>t 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa60c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58fa7c50> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa7380> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8e360> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from 
'/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ca3b60> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccc590> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccc320> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccc4d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccce60> # 
extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccd760> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccc740> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ca1d00> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58cceb70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccd8b0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8eb10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58cfaea0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d1f230> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d4bf50> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d7e7b0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d7c170> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d1fec0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58b8d1c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d1e030> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccfa70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec58d1e6f0> <<< 18699 1726882327.26501: 
stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_i7xamp3t/ansible_ansible.legacy.setup_payload.zip' <<< 18699 1726882327.26623: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.26725: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.26762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18699 1726882327.26796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18699 1726882327.26852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18699 1726882327.26967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18699 1726882327.27024: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 18699 1726882327.27035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58beee70> <<< 18699 1726882327.27065: stdout chunk (state=3): >>>import '_typing' # <<< 18699 1726882327.27123: stdout chunk (state=3): >>> <<< 18699 1726882327.27365: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58bcdd60> <<< 18699 1726882327.27369: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58bccec0> <<< 18699 1726882327.27384: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.27414: stdout chunk (state=3): >>>import 'ansible' # <<< 18699 1726882327.27466: stdout chunk (state=3): >>> <<< 18699 1726882327.27470: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18699 1726882327.27472: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.27478: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.27500: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 18699 1726882327.27515: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.29716: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.31410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 18699 1726882327.31438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 18699 1726882327.31459: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58becd40> <<< 18699 1726882327.31505: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 18699 1726882327.31530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.31574: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 18699 1726882327.31605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18699 1726882327.31646: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 18699 1726882327.31669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 18699 1726882327.31716: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 18699 
1726882327.31720: stdout chunk (state=3): >>> <<< 18699 1726882327.31747: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.31753: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58c268a0><<< 18699 1726882327.31819: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c26630><<< 18699 1726882327.31825: stdout chunk (state=3): >>> <<< 18699 1726882327.31879: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c25f40><<< 18699 1726882327.31886: stdout chunk (state=3): >>> <<< 18699 1726882327.31919: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 18699 1726882327.31926: stdout chunk (state=3): >>> <<< 18699 1726882327.31945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 18699 1726882327.32003: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c26390><<< 18699 1726882327.32011: stdout chunk (state=3): >>> <<< 18699 1726882327.32029: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58befb00> <<< 18699 1726882327.32082: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.32110: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.32154: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58c27650> # extension module 'fcntl' 
loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.32179: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882327.32183: stdout chunk (state=3): >>> import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58c27890><<< 18699 1726882327.32227: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18699 1726882327.32316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 18699 1726882327.32400: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c27dd0> import 'pwd' # <<< 18699 1726882327.32442: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 18699 1726882327.32445: stdout chunk (state=3): >>> <<< 18699 1726882327.32480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 18699 1726882327.32538: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58529b80> <<< 18699 1726882327.32585: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.32612: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.32654: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5852b7a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 18699 1726882327.32664: stdout chunk 
(state=3): >>> <<< 18699 1726882327.32691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 18699 1726882327.32749: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852c1a0><<< 18699 1726882327.32762: stdout chunk (state=3): >>> <<< 18699 1726882327.32791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 18699 1726882327.32799: stdout chunk (state=3): >>> <<< 18699 1726882327.32840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 18699 1726882327.32874: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852d340> <<< 18699 1726882327.32960: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 18699 1726882327.32966: stdout chunk (state=3): >>> <<< 18699 1726882327.32998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 18699 1726882327.33022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 18699 1726882327.33111: stdout chunk (state=3): >>> import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852fe00><<< 18699 1726882327.33116: stdout chunk (state=3): >>> <<< 18699 1726882327.33171: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882327.33191: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882327.33199: stdout chunk (state=3): >>> <<< 18699 1726882327.33218: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58efadb0> <<< 18699 1726882327.33253: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852e0c0><<< 18699 1726882327.33259: stdout chunk (state=3): >>> <<< 18699 1726882327.33296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 18699 1726882327.33339: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 18699 1726882327.33378: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 18699 1726882327.33430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18699 1726882327.33592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 18699 1726882327.33630: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 18699 1726882327.33644: stdout chunk (state=3): >>> <<< 18699 1726882327.33647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 18699 1726882327.33672: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58537d40><<< 18699 1726882327.33700: stdout chunk (state=3): >>> import '_tokenize' # <<< 18699 1726882327.33705: stdout chunk (state=3): >>> <<< 18699 1726882327.33811: stdout chunk (state=3): 
>>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58536810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58536570> <<< 18699 1726882327.33843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 18699 1726882327.33911: stdout chunk (state=3): >>> <<< 18699 1726882327.33973: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58536ae0> <<< 18699 1726882327.34010: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852e5d0> <<< 18699 1726882327.34044: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.34053: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5857ba10> <<< 18699 1726882327.34087: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857c1a0> <<< 18699 1726882327.34116: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18699 1726882327.34141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18699 1726882327.34215: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5857dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857d9d0> <<< 18699 1726882327.34233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18699 1726882327.34272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18699 1726882327.34336: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.34339: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58580170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857e2d0> <<< 18699 1726882327.34523: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58583950> <<< 18699 1726882327.34712: 
stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58580320> <<< 18699 1726882327.34795: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec585849b0> <<< 18699 1726882327.34839: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58584b30> <<< 18699 1726882327.34902: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58584a40> <<< 18699 1726882327.34933: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857c380> <<< 18699 1726882327.34965: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18699 1726882327.34976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches 
/usr/lib64/python3.12/socket.py <<< 18699 1726882327.35010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18699 1726882327.35044: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.35073: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5840c1d0> <<< 18699 1726882327.35310: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.35320: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.35331: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5840d1f0> <<< 18699 1726882327.35338: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58586960> <<< 18699 1726882327.35516: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58587d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec585865a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18699 1726882327.35547: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.35681: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.35696: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 18699 1726882327.35719: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.35740: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 18699 1726882327.35766: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.35945: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.36134: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.37013: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.37887: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 18699 1726882327.37903: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 18699 1726882327.37939: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 18699 1726882327.37959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.38024: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58415430><<< 18699 1726882327.38030: stdout chunk (state=3): >>> <<< 18699 1726882327.38317: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584161e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5840d430> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18699 1726882327.38497: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.38741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 18699 1726882327.38751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18699 1726882327.38754: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584162d0> <<< 18699 1726882327.38773: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.39800: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40242: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40348: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40464: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18699 1726882327.40476: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40525: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40586: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18699 1726882327.40590: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40690: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40819: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18699 1726882327.40836: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 
1726882327.40866: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18699 1726882327.40882: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40934: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.40996: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18699 1726882327.41000: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.41516: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.41734: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18699 1726882327.41819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18699 1726882327.41832: stdout chunk (state=3): >>>import '_ast' # <<< 18699 1726882327.41926: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584174a0> <<< 18699 1726882327.41936: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.42039: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.42214: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 18699 1726882327.42253: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.42323: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18699 1726882327.42398: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.42442: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.42539: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.42614: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18699 1726882327.42671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.42781: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58421f10> <<< 18699 1726882327.43022: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5841d700> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.43052: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.43097: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.43148: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.43175: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18699 1726882327.43222: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18699 1726882327.43239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches 
/usr/lib64/python3.12/argparse.py <<< 18699 1726882327.43329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18699 1726882327.43358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18699 1726882327.43451: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5850a930> <<< 18699 1726882327.43507: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec585fe600> <<< 18699 1726882327.43628: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58422000> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58417080> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 18699 1726882327.43666: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.43681: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.43724: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18699 1726882327.43813: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 18699 1726882327.43838: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.43861: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 18699 1726882327.44051: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.44054: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.44090: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.44097: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 
1726882327.44165: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.44502: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.44532: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.44568: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.44621: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 18699 1726882327.44638: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.44876: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.45155: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.45210: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.45290: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882327.45319: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 18699 1726882327.45522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b5e80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 18699 1726882327.45551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 18699 1726882327.45575: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580d3f50> <<< 18699 1726882327.45616: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.45640: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec580ec2c0> <<< 18699 1726882327.45700: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5849f110> <<< 18699 1726882327.45728: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b6a20> <<< 18699 1726882327.45763: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b4530> <<< 18699 1726882327.45788: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b4950> <<< 18699 1726882327.45807: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18699 1726882327.45865: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 18699 1726882327.45900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 18699 1726882327.45919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 18699 1726882327.45949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 18699 1726882327.45981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 18699 1726882327.46001: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec580ef290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580eeb40> <<< 18699 1726882327.46047: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882327.46055: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec580eed20> <<< 18699 1726882327.46076: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580edf70> <<< 18699 1726882327.46348: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 18699 1726882327.46358: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580ef2f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882327.46524: stdout chunk (state=3): >>> # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58141e20> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580efe00> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b4230> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 18699 1726882327.46581: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.46671: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 18699 1726882327.46721: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.46791: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.46961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 18699 1726882327.46973: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 18699 1726882327.46991: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18699 1726882327.47033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 18699 1726882327.47050: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.47114: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.47197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 18699 1726882327.47259: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.47317: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 18699 1726882327.47418: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.47481: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.47558: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.47656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 18699 1726882327.47668: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.48633: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.49343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 18699 1726882327.49347: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # <<< 18699 1726882327.49350: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.49460: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 18699 1726882327.49620: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 18699 1726882327.49662: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.49749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc'<<< 18699 1726882327.49769: stdout chunk (state=3): >>> <<< 18699 1726882327.49785: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58143b60> <<< 18699 1726882327.49806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 18699 1726882327.49836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18699 1726882327.49951: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58142960> import 'ansible.module_utils.facts.system.local' # <<< 18699 1726882327.49972: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.50022: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.50086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 18699 1726882327.50106: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.50190: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.50283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 18699 1726882327.50308: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.50369: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.50445: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.platform' # <<< 18699 1726882327.50511: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.50554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18699 1726882327.50571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18699 1726882327.50940: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5817a120> <<< 18699 1726882327.51042: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5816af60> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 18699 1726882327.51103: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.51184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 18699 1726882327.51203: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.51304: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.51420: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.51619: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.51810: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 18699 1726882327.51868: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.51923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 18699 1726882327.51942: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 18699 1726882327.51978: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.52047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 18699 1726882327.52102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5818d8e0> <<< 18699 1726882327.52115: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5816b140> import 'ansible.module_utils.facts.system.user' # <<< 18699 1726882327.52183: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 18699 1726882327.52257: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 18699 1726882327.52260: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.52498: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.52735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 18699 1726882327.52905: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.53034: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.53073: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.53132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 18699 1726882327.53171: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.53195: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.53391: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.53559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 18699 1726882327.53779: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.53939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.54513: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.55107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 18699 1726882327.55192: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.55353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 18699 1726882327.55356: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.55496: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.55658: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18699 1726882327.55661: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.55880: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.56119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 18699 1726882327.56129: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.56157: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 18699 1726882327.56219: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 18699 1726882327.56268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 18699 1726882327.56271: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.56500: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.56566: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.56937: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57198: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 18699 1726882327.57224: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57255: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 18699 1726882327.57322: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57380: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 18699 1726882327.57383: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57594: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # <<< 18699 1726882327.57601: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57647: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.57778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.57782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 18699 1726882327.57882: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58099: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # 
zipimport: zlib available <<< 18699 1726882327.58188: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 18699 1726882327.58433: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58515: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 18699 1726882327.58577: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.58726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 18699 1726882327.58729: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58731: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58786: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 18699 1726882327.58805: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.58961: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.59015: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.59044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 18699 1726882327.59086: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.59131: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.59207: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 
1726882327.59255: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.59332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 18699 1726882327.59351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 18699 1726882327.59392: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.59508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18699 1726882327.59881: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 18699 1726882327.59933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 18699 1726882327.59978: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.60031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 18699 1726882327.60046: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.60116: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.60195: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 18699 1726882327.60256: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.60286: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.60380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18699 1726882327.60455: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882327.60632: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 18699 1726882327.60691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 18699 1726882327.60723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec57f8f0b0> <<< 18699 1726882327.60789: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57f8ff80> <<< 18699 1726882327.60805: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57f8d400> <<< 18699 1726882327.73039: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 18699 1726882327.73073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd49b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd4b30> <<< 18699 1726882327.73325: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd6000> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd5a60> <<< 18699 1726882327.73414: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 18699 1726882328.00073: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.4921875, "5m": 0.3232421875, "15m": 0.15673828125}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptim<<< 18699 1726882328.00107: stdout chunk (state=3): >>>e_seconds": 760, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794553856, "block_size": 4096, "block_total": 65519099, "block_available": 63914686, "block_used": 1604413, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": 
{"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off<<< 18699 1726882328.00125: stdout chunk (state=3): >>> [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": 
"off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_<<< 18699 1726882328.00132: stdout chunk (state=3): >>>addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "07", "epoch": "1726882327", "epoch_int": "1726882327", "date": "2024-09-20", "time": "21:32:07", "iso8601_micro": "2024-09-21T01:32:07.996077Z", "iso8601": "2024-09-21T01:32:07Z", "iso8601_basic": "20240920T213207996077", "iso8601_basic_short": "20240920T213207", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882328.00937: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 18699 1726882328.00973: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # 
restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs <<< 18699 1726882328.01022: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # 
cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 18699 1726882328.01084: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 18699 1726882328.01170: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text 
# cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy 
ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq <<< 18699 1726882328.01248: stdout chunk (state=3): >>># cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] 
removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanu<<< 18699 1726882328.01252: stdout chunk (state=3): >>>p[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor 
# destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 18699 1726882328.01838: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 18699 1726882328.01852: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 18699 1726882328.01890: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # 
destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 18699 1726882328.02036: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool <<< 18699 1726882328.02055: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy _compat_pickle <<< 18699 1726882328.02079: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 18699 1726882328.02133: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 18699 1726882328.02274: stdout chunk (state=3): >>># destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 18699 1726882328.02325: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping 
_ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 18699 1726882328.02351: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 18699 1726882328.02379: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 18699 1726882328.02481: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 18699 1726882328.02527: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] 
wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18699 1726882328.02608: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18699 1726882328.02736: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 18699 1726882328.02833: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18699 1726882328.02877: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 18699 1726882328.02889: stdout chunk (state=3): >>># 
destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 18699 1726882328.02936: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 18699 1726882328.02997: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18699 1726882328.03470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882328.03473: stdout chunk (state=3): >>><<< 18699 1726882328.03476: stderr chunk (state=3): >>><<< 18699 1726882328.03952: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec591104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec590dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fec59112a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ec1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ec1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58effd70> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efffb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f37770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec58f37e00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f17a40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f15160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efcf20> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f576b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f562d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f16030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f54b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8c6b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efc1a0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58f8cb60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8ca10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58f8ce00> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58efacc0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8d4f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8d1c0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8e3f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa45f0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58fa5cd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa6b70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58fa71d0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa60c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58fa7c50> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58fa7380> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8e360> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ca3b60> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccc590> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccc320> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccc4d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccce60> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58ccd760> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccc740> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ca1d00> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58cceb70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccd8b0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58f8eb10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec58cfaea0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d1f230> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d4bf50> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d7e7b0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d7c170> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d1fec0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec58b8d1c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58d1e030> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58ccfa70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fec58d1e6f0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_i7xamp3t/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58beee70> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58bcdd60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58bccec0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58becd40> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58c268a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c26630> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c25f40> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c26390> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58befb00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58c27650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58c27890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58c27dd0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58529b80> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5852b7a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852c1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852d340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec5852fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58efadb0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58537d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58536810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58536570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58536ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5852e5d0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5857ba10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5857dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58580170> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec5857e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58583950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58580320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec585849b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58584b30> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58584a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5857c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5840c1d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5840d1f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58586960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58587d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec585865a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58415430> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584161e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5840d430> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584162d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584174a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58421f10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5841d700> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5850a930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec585fe600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58422000> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58417080> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b5e80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580d3f50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec580ec2c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5849f110> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fec584b6a20> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b4530> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b4950> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec580ef290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580eeb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec580eed20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580edf70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580ef2f0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec58141e20> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec580efe00> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec584b4230> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58143b60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec58142960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5817a120> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5816af60> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec5818d8e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec5816b140> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fec57f8f0b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57f8ff80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57f8d400> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd49b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd4b30> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd6000> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fec57fd5a60> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.4921875, "5m": 0.3232421875, "15m": 0.15673828125}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC 
Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 760, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794553856, "block_size": 4096, "block_total": 65519099, "block_available": 63914686, "block_used": 1604413, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": 
"on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": 
"12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "07", "epoch": "1726882327", "epoch_int": "1726882327", "date": "2024-09-20", "time": "21:32:07", "iso8601_micro": "2024-09-21T01:32:07.996077Z", "iso8601": "2024-09-21T01:32:07Z", "iso8601_basic": "20240920T213207996077", "iso8601_basic_short": "20240920T213207", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] 
removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing 
weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] 
removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] 
removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata 
# destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
[WARNING]: Module invocation had junk after the JSON data:
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
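The interpreter-discovery warning above can be silenced by pinning the interpreter explicitly instead of letting Ansible discover it. A minimal sketch against the inventory hosts seen in this run; the host/group layout and variable placement are assumptions, not taken from the log (the actual inventory file at /tmp/network-Kc3/inventory.yml is not shown):

```yaml
# Hypothetical inventory snippet — pins the discovered interpreter so
# discovery (and its warning) is skipped. ansible_python_interpreter is
# a standard Ansible connection variable; the layout here is assumed.
all:
  hosts:
    managed_node1:
      ansible_python_interpreter: /usr/bin/python3.12
    managed_node2:
      ansible_python_interpreter: /usr/bin/python3.12
```

Setting `interpreter_python` under `[defaults]` in ansible.cfg achieves the same effect globally.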
18699 1726882328.07009: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882328.07012: _low_level_execute_command(): starting 18699 1726882328.07014: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882326.5104434-18736-32929865767353/ > /dev/null 2>&1 && sleep 0' 18699 1726882328.07016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882328.07018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882328.07020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882328.07022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.07025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.07027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882328.07029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882328.07031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882328.07147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882328.08851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882328.08854: stdout chunk (state=3): >>><<< 18699 1726882328.08861: stderr chunk (state=3): >>><<< 18699 1726882328.08900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882328.08956: handler run complete 18699 1726882328.09227: variable 'ansible_facts' from source: unknown 18699 1726882328.09319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.10033: variable 'ansible_facts' from source: unknown 18699 1726882328.10342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.10418: attempt loop complete, returning result 18699 1726882328.10566: _execute() done 18699 1726882328.10573: dumping result to json 18699 1726882328.10607: done dumping result, returning 18699 1726882328.10619: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-00000000007c] 18699 1726882328.10626: sending task result for task 12673a56-9f93-1ce6-d207-00000000007c 18699 1726882328.11697: done sending task result for task 12673a56-9f93-1ce6-d207-00000000007c 18699 1726882328.11702: WORKER PROCESS EXITING ok: [managed_node1] 18699 1726882328.12274: no more pending results, returning what we have 18699 1726882328.12277: results queue empty 18699 1726882328.12278: checking for any_errors_fatal 18699 1726882328.12279: done checking for any_errors_fatal 18699 1726882328.12280: checking for max_fail_percentage 18699 1726882328.12281: done checking for max_fail_percentage 18699 1726882328.12282: checking to see if all hosts have failed and the running result is not ok 18699 1726882328.12282: done checking to see if all hosts have failed 18699 1726882328.12283: getting the remaining hosts for this loop 18699 1726882328.12285: done getting the remaining hosts for this loop 18699 1726882328.12288: getting the next task for host managed_node1 18699 1726882328.12301: done getting next task for host managed_node1 18699 1726882328.12303: ^ task is: TASK: meta 
(flush_handlers) 18699 1726882328.12305: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882328.12308: getting variables 18699 1726882328.12309: in VariableManager get_vars() 18699 1726882328.12329: Calling all_inventory to load vars for managed_node1 18699 1726882328.12332: Calling groups_inventory to load vars for managed_node1 18699 1726882328.12335: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882328.12343: Calling all_plugins_play to load vars for managed_node1 18699 1726882328.12346: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882328.12348: Calling groups_plugins_play to load vars for managed_node1 18699 1726882328.12650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.13037: done with get_vars() 18699 1726882328.13048: done getting variables 18699 1726882328.13318: in VariableManager get_vars() 18699 1726882328.13327: Calling all_inventory to load vars for managed_node1 18699 1726882328.13330: Calling groups_inventory to load vars for managed_node1 18699 1726882328.13332: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882328.13336: Calling all_plugins_play to load vars for managed_node1 18699 1726882328.13339: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882328.13341: Calling groups_plugins_play to load vars for managed_node1 18699 1726882328.13474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.13865: done with get_vars() 18699 1726882328.13878: done queuing things up, now waiting for results queue to drain 18699 1726882328.13880: results queue 
empty 18699 1726882328.13881: checking for any_errors_fatal 18699 1726882328.13883: done checking for any_errors_fatal 18699 1726882328.13884: checking for max_fail_percentage 18699 1726882328.13890: done checking for max_fail_percentage 18699 1726882328.13890: checking to see if all hosts have failed and the running result is not ok 18699 1726882328.13891: done checking to see if all hosts have failed 18699 1726882328.13892: getting the remaining hosts for this loop 18699 1726882328.14197: done getting the remaining hosts for this loop 18699 1726882328.14201: getting the next task for host managed_node1 18699 1726882328.14206: done getting next task for host managed_node1 18699 1726882328.14208: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18699 1726882328.14210: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882328.14212: getting variables 18699 1726882328.14213: in VariableManager get_vars() 18699 1726882328.14222: Calling all_inventory to load vars for managed_node1 18699 1726882328.14224: Calling groups_inventory to load vars for managed_node1 18699 1726882328.14226: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882328.14231: Calling all_plugins_play to load vars for managed_node1 18699 1726882328.14233: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882328.14236: Calling groups_plugins_play to load vars for managed_node1 18699 1726882328.14387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.14776: done with get_vars() 18699 1726882328.14784: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 21:32:08 -0400 (0:00:01.718) 0:00:01.746 ****** 18699 1726882328.15062: entering _queue_task() for managed_node1/include_tasks 18699 1726882328.15064: Creating lock for include_tasks 18699 1726882328.15484: worker is 1 (out of 1 available) 18699 1726882328.16298: exiting _queue_task() for managed_node1/include_tasks 18699 1726882328.16306: done queuing things up, now waiting for results queue to drain 18699 1726882328.16307: waiting for pending results... 
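The task banner above ("Include the task 'el_repo_setup.yml'", task path tests_ethernet_nm.yml:11) corresponds to an `include_tasks` entry, which is why the log shows "Creating lock for include_tasks" and later "we have included files to process". A hedged reconstruction of what that entry plausibly looks like; only the task name and target filename are taken from the log, the rest is assumed:

```yaml
# Hypothetical sketch of the task at tests_ethernet_nm.yml:11 — the
# real playbook is not reproduced in this log.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```

Because `include_tasks` is dynamic, the included blocks are loaded, tag-filtered, and appended to the host's task list at run time, exactly the sequence the subsequent log lines trace.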
18699 1726882328.16366: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 18699 1726882328.16372: in run() - task 12673a56-9f93-1ce6-d207-000000000006 18699 1726882328.16375: variable 'ansible_search_path' from source: unknown 18699 1726882328.16377: calling self._execute() 18699 1726882328.16683: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882328.16687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882328.16690: variable 'omit' from source: magic vars 18699 1726882328.16889: _execute() done 18699 1726882328.17120: dumping result to json 18699 1726882328.17124: done dumping result, returning 18699 1726882328.17126: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-1ce6-d207-000000000006] 18699 1726882328.17128: sending task result for task 12673a56-9f93-1ce6-d207-000000000006 18699 1726882328.17213: done sending task result for task 12673a56-9f93-1ce6-d207-000000000006 18699 1726882328.17217: WORKER PROCESS EXITING 18699 1726882328.17340: no more pending results, returning what we have 18699 1726882328.17344: in VariableManager get_vars() 18699 1726882328.17375: Calling all_inventory to load vars for managed_node1 18699 1726882328.17378: Calling groups_inventory to load vars for managed_node1 18699 1726882328.17381: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882328.17391: Calling all_plugins_play to load vars for managed_node1 18699 1726882328.17397: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882328.17401: Calling groups_plugins_play to load vars for managed_node1 18699 1726882328.17864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.18668: done with get_vars() 18699 1726882328.18676: variable 'ansible_search_path' from source: unknown 18699 1726882328.18689: we have 
included files to process 18699 1726882328.18690: generating all_blocks data 18699 1726882328.18691: done generating all_blocks data 18699 1726882328.18692: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18699 1726882328.18699: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18699 1726882328.18702: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18699 1726882328.20927: in VariableManager get_vars() 18699 1726882328.20944: done with get_vars() 18699 1726882328.20956: done processing included file 18699 1726882328.20958: iterating over new_blocks loaded from include file 18699 1726882328.20959: in VariableManager get_vars() 18699 1726882328.20968: done with get_vars() 18699 1726882328.20969: filtering new block on tags 18699 1726882328.20982: done filtering new block on tags 18699 1726882328.20985: in VariableManager get_vars() 18699 1726882328.21603: done with get_vars() 18699 1726882328.21605: filtering new block on tags 18699 1726882328.21623: done filtering new block on tags 18699 1726882328.21626: in VariableManager get_vars() 18699 1726882328.21638: done with get_vars() 18699 1726882328.21639: filtering new block on tags 18699 1726882328.21652: done filtering new block on tags 18699 1726882328.21654: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 18699 1726882328.21660: extending task lists for all hosts with included blocks 18699 1726882328.21709: done extending task lists 18699 1726882328.21711: done processing included files 18699 1726882328.21712: results queue empty 18699 1726882328.21712: checking for any_errors_fatal 18699 1726882328.21713: done checking for any_errors_fatal 18699 
1726882328.21714: checking for max_fail_percentage 18699 1726882328.21715: done checking for max_fail_percentage 18699 1726882328.21716: checking to see if all hosts have failed and the running result is not ok 18699 1726882328.21716: done checking to see if all hosts have failed 18699 1726882328.21717: getting the remaining hosts for this loop 18699 1726882328.21718: done getting the remaining hosts for this loop 18699 1726882328.21720: getting the next task for host managed_node1 18699 1726882328.21724: done getting next task for host managed_node1 18699 1726882328.21726: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18699 1726882328.21728: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882328.21730: getting variables 18699 1726882328.21731: in VariableManager get_vars() 18699 1726882328.21739: Calling all_inventory to load vars for managed_node1 18699 1726882328.21741: Calling groups_inventory to load vars for managed_node1 18699 1726882328.21743: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882328.21749: Calling all_plugins_play to load vars for managed_node1 18699 1726882328.21751: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882328.21753: Calling groups_plugins_play to load vars for managed_node1 18699 1726882328.22249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.22836: done with get_vars() 18699 1726882328.22846: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:32:08 -0400 (0:00:00.082) 0:00:01.829 ****** 18699 1726882328.23322: entering _queue_task() for managed_node1/setup 18699 1726882328.24048: worker is 1 (out of 1 available) 18699 1726882328.24059: exiting _queue_task() for managed_node1/setup 18699 1726882328.24071: done queuing things up, now waiting for results queue to drain 18699 1726882328.24072: waiting for pending results... 
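The fact-gathering task queued here is skipped later in the run because its `when` condition, `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`, evaluates to False — i.e. every required fact key is already cached. A rough Python approximation of that Jinja check (the fact names below are hypothetical; Jinja's `intersect` filter semantics are only approximated):

```python
# Sketch (not Ansible/Jinja source): skip fact gathering when every
# required fact key is already present in the cached ansible_facts.
def needs_fact_gathering(ansible_facts: dict, required: list) -> bool:
    # Approximates `ansible_facts.keys() | list | intersect(required)`:
    # keys of the first operand that also appear in the second.
    have = [k for k in ansible_facts.keys() if k in required]
    # `not <intersection> == required`: True -> run setup; False -> skip.
    return not have == required

# Illustrative fact names; with everything cached the task is skipped,
# matching the "Conditional result was False" outcome in this log.
facts = {"distribution": "Fedora", "os_family": "RedHat"}
required = ["distribution", "os_family"]
print(needs_fact_gathering(facts, required))  # → False (skip)
print(needs_fact_gathering({}, required))     # → True  (gather)
```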
18699 1726882328.25119: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 18699 1726882328.25126: in run() - task 12673a56-9f93-1ce6-d207-00000000008d 18699 1726882328.25130: variable 'ansible_search_path' from source: unknown 18699 1726882328.25133: variable 'ansible_search_path' from source: unknown 18699 1726882328.25135: calling self._execute() 18699 1726882328.25478: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882328.25550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882328.25664: variable 'omit' from source: magic vars 18699 1726882328.27217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882328.32408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882328.32487: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882328.32617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882328.32668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882328.32905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882328.33013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882328.33062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882328.33108: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882328.33312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882328.33475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882328.33883: variable 'ansible_facts' from source: unknown 18699 1726882328.34280: variable 'network_test_required_facts' from source: task vars 18699 1726882328.34327: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 18699 1726882328.34399: when evaluation is False, skipping this task 18699 1726882328.34407: _execute() done 18699 1726882328.34710: dumping result to json 18699 1726882328.34713: done dumping result, returning 18699 1726882328.34716: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-1ce6-d207-00000000008d] 18699 1726882328.34718: sending task result for task 12673a56-9f93-1ce6-d207-00000000008d 18699 1726882328.34790: done sending task result for task 12673a56-9f93-1ce6-d207-00000000008d 18699 1726882328.34792: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 18699 1726882328.34858: no more pending results, returning what we have 18699 1726882328.34861: results queue empty 18699 1726882328.34862: checking for any_errors_fatal 18699 1726882328.34864: 
done checking for any_errors_fatal 18699 1726882328.34864: checking for max_fail_percentage 18699 1726882328.34865: done checking for max_fail_percentage 18699 1726882328.34866: checking to see if all hosts have failed and the running result is not ok 18699 1726882328.34867: done checking to see if all hosts have failed 18699 1726882328.34868: getting the remaining hosts for this loop 18699 1726882328.34869: done getting the remaining hosts for this loop 18699 1726882328.34873: getting the next task for host managed_node1 18699 1726882328.34880: done getting next task for host managed_node1 18699 1726882328.34883: ^ task is: TASK: Check if system is ostree 18699 1726882328.34885: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882328.34889: getting variables 18699 1726882328.34890: in VariableManager get_vars() 18699 1726882328.34923: Calling all_inventory to load vars for managed_node1 18699 1726882328.34926: Calling groups_inventory to load vars for managed_node1 18699 1726882328.34930: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882328.34941: Calling all_plugins_play to load vars for managed_node1 18699 1726882328.34945: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882328.34951: Calling groups_plugins_play to load vars for managed_node1 18699 1726882328.35547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882328.35828: done with get_vars() 18699 1726882328.35838: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:32:08 -0400 (0:00:00.128) 0:00:01.957 ****** 18699 1726882328.36130: entering _queue_task() for managed_node1/stat 18699 1726882328.36573: worker is 1 (out of 1 available) 18699 1726882328.36587: exiting _queue_task() for managed_node1/stat 18699 1726882328.36800: done queuing things up, now waiting for results queue to drain 18699 1726882328.36802: waiting for pending results... 
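For the `stat` task that follows, the first thing the connection plugin does is discover the remote home directory, running the exact command visible in the log: `/bin/sh -c 'echo ~ && sleep 0'` (the multiplexed SSH session then returns `/root`). A local, illustrative reproduction of that discovery step — run via subprocess here instead of over SSH:

```python
# Sketch: the home-directory probe from _low_level_execute_command(),
# executed locally for illustration rather than through the SSH mux.
import subprocess

proc = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True,
)
home = proc.stdout.strip()
# In the log this yields rc=0 and stdout "/root" on the managed node.
print(proc.returncode, home)
```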
18699 1726882328.37023: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 18699 1726882328.37352: in run() - task 12673a56-9f93-1ce6-d207-00000000008f 18699 1726882328.37357: variable 'ansible_search_path' from source: unknown 18699 1726882328.37359: variable 'ansible_search_path' from source: unknown 18699 1726882328.37363: calling self._execute() 18699 1726882328.37407: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882328.37423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882328.37442: variable 'omit' from source: magic vars 18699 1726882328.37957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882328.38242: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882328.38292: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882328.38342: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882328.38400: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882328.38490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882328.38522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882328.38565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882328.38598: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882328.38728: Evaluated conditional (not __network_is_ostree is defined): True 18699 1726882328.38739: variable 'omit' from source: magic vars 18699 1726882328.38792: variable 'omit' from source: magic vars 18699 1726882328.38837: variable 'omit' from source: magic vars 18699 1726882328.38977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882328.38982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882328.38985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882328.38987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882328.38990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882328.39025: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882328.39034: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882328.39043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882328.39149: Set connection var ansible_connection to ssh 18699 1726882328.39163: Set connection var ansible_pipelining to False 18699 1726882328.39175: Set connection var ansible_shell_executable to /bin/sh 18699 1726882328.39186: Set connection var ansible_timeout to 10 18699 1726882328.39223: Set connection var ansible_shell_type to sh 18699 1726882328.39226: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882328.39256: variable 'ansible_shell_executable' from source: unknown 18699 1726882328.39301: variable 'ansible_connection' from 
source: unknown 18699 1726882328.39305: variable 'ansible_module_compression' from source: unknown 18699 1726882328.39307: variable 'ansible_shell_type' from source: unknown 18699 1726882328.39309: variable 'ansible_shell_executable' from source: unknown 18699 1726882328.39311: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882328.39313: variable 'ansible_pipelining' from source: unknown 18699 1726882328.39315: variable 'ansible_timeout' from source: unknown 18699 1726882328.39318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882328.39453: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882328.39468: variable 'omit' from source: magic vars 18699 1726882328.39478: starting attempt loop 18699 1726882328.39501: running the handler 18699 1726882328.39507: _low_level_execute_command(): starting 18699 1726882328.39521: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882328.40906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.40937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882328.40960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882328.40978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882328.41070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882328.42678: stdout chunk (state=3): >>>/root <<< 18699 1726882328.42817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882328.42832: stdout chunk (state=3): >>><<< 18699 1726882328.42847: stderr chunk (state=3): >>><<< 18699 1726882328.42882: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882328.42913: _low_level_execute_command(): starting 18699 1726882328.43001: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948 `" && echo ansible-tmp-1726882328.4289849-18827-78238000143948="` echo /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948 `" ) && sleep 0' 18699 1726882328.43559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882328.43573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882328.43586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882328.43614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882328.43725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882328.43729: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882328.43758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 18699 1726882328.43829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882328.45679: stdout chunk (state=3): >>>ansible-tmp-1726882328.4289849-18827-78238000143948=/root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948 <<< 18699 1726882328.45832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882328.45835: stdout chunk (state=3): >>><<< 18699 1726882328.45837: stderr chunk (state=3): >>><<< 18699 1726882328.45953: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882328.4289849-18827-78238000143948=/root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882328.45956: variable 'ansible_module_compression' from source: unknown 18699 1726882328.45988: ANSIBALLZ: 
Using lock for stat 18699 1726882328.45999: ANSIBALLZ: Acquiring lock 18699 1726882328.46007: ANSIBALLZ: Lock acquired: 140254446049984 18699 1726882328.46014: ANSIBALLZ: Creating module 18699 1726882328.56796: ANSIBALLZ: Writing module into payload 18699 1726882328.56858: ANSIBALLZ: Writing module 18699 1726882328.56882: ANSIBALLZ: Renaming module 18699 1726882328.56885: ANSIBALLZ: Done creating module 18699 1726882328.56903: variable 'ansible_facts' from source: unknown 18699 1726882328.56947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py 18699 1726882328.57056: Sending initial data 18699 1726882328.57060: Sent initial data (152 bytes) 18699 1726882328.57474: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882328.57477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.57480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882328.57482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882328.57484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.57538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882328.57543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882328.57588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882328.59752: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882328.59756: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882328.59809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882328.59849: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpy8k8igpw /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py <<< 18699 1726882328.59852: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py" <<< 18699 1726882328.59892: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpy8k8igpw" to remote "/root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py" <<< 18699 1726882328.59901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py" <<< 18699 1726882328.60473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882328.60507: stderr chunk (state=3): >>><<< 18699 1726882328.60511: stdout chunk (state=3): >>><<< 18699 1726882328.60527: done transferring module to remote 18699 1726882328.60538: _low_level_execute_command(): starting 18699 1726882328.60543: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/ /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py && sleep 0' 18699 1726882328.60962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882328.60965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.60968: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882328.60970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882328.60972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.61021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882328.61024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882328.61076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882328.63600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882328.63604: stderr chunk (state=3): >>><<< 18699 1726882328.63606: stdout chunk (state=3): >>><<< 18699 1726882328.63614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882328.63623: _low_level_execute_command(): starting 18699 1726882328.63633: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/AnsiballZ_stat.py && sleep 0' 18699 1726882328.64276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882328.64292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882328.64352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.64423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 
1726882328.64451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882328.64557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882328.67741: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 18699 1726882328.67831: stdout chunk (state=3): >>>import '_io' # <<< 18699 1726882328.67835: stdout chunk (state=3): >>>import 'marshal' # <<< 18699 1726882328.67900: stdout chunk (state=3): >>> import 'posix' # <<< 18699 1726882328.67959: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 18699 1726882328.67982: stdout chunk (state=3): >>># installing zipimport hook <<< 18699 1726882328.68016: stdout chunk (state=3): >>>import 'time' # <<< 18699 1726882328.68098: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 18699 1726882328.68127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882328.68169: stdout chunk (state=3): >>>import '_codecs' # <<< 18699 1726882328.68172: stdout chunk (state=3): >>> <<< 18699 1726882328.68208: stdout chunk (state=3): >>>import 'codecs' # <<< 18699 1726882328.68263: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 18699 1726882328.68278: stdout chunk (state=3): >>> <<< 18699 1726882328.68307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 18699 1726882328.68339: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7c184d0><<< 18699 1726882328.68351: stdout chunk (state=3): 
>>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7be7b30><<< 18699 1726882328.68392: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 18699 1726882328.68416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 18699 1726882328.68434: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7c1aa50><<< 18699 1726882328.68479: stdout chunk (state=3): >>> import '_signal' # <<< 18699 1726882328.68491: stdout chunk (state=3): >>> <<< 18699 1726882328.68513: stdout chunk (state=3): >>>import '_abc' # <<< 18699 1726882328.68558: stdout chunk (state=3): >>> import 'abc' # import 'io' # <<< 18699 1726882328.68569: stdout chunk (state=3): >>> <<< 18699 1726882328.68610: stdout chunk (state=3): >>>import '_stat' # <<< 18699 1726882328.68633: stdout chunk (state=3): >>> import 'stat' # <<< 18699 1726882328.68714: stdout chunk (state=3): >>> <<< 18699 1726882328.68775: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18699 1726882328.68826: stdout chunk (state=3): >>>import 'genericpath' # <<< 18699 1726882328.68853: stdout chunk (state=3): >>>import 'posixpath' # <<< 18699 1726882328.68953: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 18699 1726882328.68984: stdout chunk (state=3): >>> Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages'<<< 18699 1726882328.69018: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 18699 1726882328.69034: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 18699 1726882328.69083: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py <<< 18699 1726882328.69114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 18699 1726882328.69137: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a79c9130> <<< 18699 1726882328.69213: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 18699 1726882328.69234: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 18699 1726882328.69271: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a79c9fa0><<< 18699 1726882328.69284: stdout chunk (state=3): >>> <<< 18699 1726882328.69318: stdout chunk (state=3): >>>import 'site' # <<< 18699 1726882328.69526: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 18699 1726882328.69752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18699 1726882328.69780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18699 1726882328.69838: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 18699 1726882328.69841: stdout chunk (state=3): >>> <<< 18699 1726882328.69898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 18699 1726882328.69901: stdout chunk (state=3): >>> <<< 18699 1726882328.69973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18699 1726882328.70005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18699 1726882328.70057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 18699 1726882328.70092: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a07e30> <<< 18699 1726882328.70124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 18699 1726882328.70139: stdout chunk (state=3): >>> <<< 18699 1726882328.70156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 18699 1726882328.70204: stdout chunk (state=3): >>> import '_operator' # <<< 18699 1726882328.70242: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a07ef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py <<< 18699 1726882328.70288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 18699 1726882328.70333: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18699 1726882328.70428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882328.70470: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 18699 1726882328.70490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a3f860> <<< 18699 1726882328.70564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 18699 1726882328.70585: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a3fef0> import '_collections' # <<< 18699 1726882328.70677: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a1fb00> import '_functools' # <<< 18699 1726882328.70721: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a1d220> <<< 18699 1726882328.70880: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a04fe0> <<< 18699 1726882328.70946: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18699 1726882328.70977: stdout chunk (state=3): >>>import '_sre' # <<< 18699 1726882328.71013: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 18699 1726882328.71065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 18699 1726882328.71101: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 18699 1726882328.71398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a5f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a5e420> <<< 18699 1726882328.71436: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a1e0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a5cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a94830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a04260> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 18699 1726882328.71454: stdout chunk (state=3): >>> <<< 18699 1726882328.71479: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.71503: stdout chunk (state=3): >>> <<< 18699 1726882328.71526: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.71548: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7a94ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a94b90><<< 18699 1726882328.71580: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.71610: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.71641: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7a94f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a02d80><<< 18699 1726882328.71697: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 18699 1726882328.71700: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 18699 1726882328.71718: stdout chunk (state=3): >>> <<< 18699 1726882328.71749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 18699 1726882328.71861: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 18699 1726882328.71867: stdout chunk (state=3): >>>import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a95670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a95340> import 'importlib.machinery' # <<< 18699 1726882328.71914: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 18699 1726882328.71924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 18699 1726882328.71999: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a96570> import 'importlib.util' # <<< 18699 1726882328.72002: stdout chunk (state=3): >>> import 'runpy' # <<< 18699 1726882328.72082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18699 1726882328.72124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 18699 1726882328.72143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 18699 1726882328.72176: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aac770> import 'errno' # <<< 18699 1726882328.72192: stdout chunk (state=3): >>> <<< 18699 1726882328.72229: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.72264: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7aade50> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches 
/usr/lib64/python3.12/bz2.py<<< 18699 1726882328.72273: stdout chunk (state=3): >>> <<< 18699 1726882328.72316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 18699 1726882328.72362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aaecf0><<< 18699 1726882328.72376: stdout chunk (state=3): >>> <<< 18699 1726882328.72425: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7aaf320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aae240><<< 18699 1726882328.72455: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 18699 1726882328.72527: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.72539: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7aafda0><<< 18699 1726882328.72566: stdout chunk (state=3): >>> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aaf4d0> <<< 18699 1726882328.72654: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe7a7a964e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 18699 1726882328.72672: stdout chunk (state=3): >>> <<< 18699 1726882328.72697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 18699 1726882328.72738: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18699 1726882328.72814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.72828: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.72873: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7837cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py<<< 18699 1726882328.72898: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 18699 1726882328.72947: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.72950: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.72972: stdout chunk (state=3): >>> import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7860710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7860470><<< 18699 1726882328.73026: stdout chunk (state=3): >>> # extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.73045: stdout chunk (state=3): >>> # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.73054: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7860740><<< 18699 1726882328.73099: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 18699 1726882328.73121: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18699 1726882328.73218: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.73408: stdout chunk (state=3): >>> # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.73432: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7861070> <<< 18699 1726882328.73596: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.73600: stdout chunk (state=3): >>> <<< 18699 1726882328.73628: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7861a60> <<< 18699 1726882328.73662: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7860920><<< 18699 1726882328.73694: stdout chunk (state=3): >>> <<< 18699 1726882328.73707: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fe7a7835e50> <<< 18699 1726882328.73738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 18699 1726882328.73748: stdout chunk (state=3): >>> <<< 18699 1726882328.73787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 18699 1726882328.73812: stdout chunk (state=3): >>> <<< 18699 1726882328.73824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18699 1726882328.73853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 18699 1726882328.73883: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7862e40><<< 18699 1726882328.73886: stdout chunk (state=3): >>> <<< 18699 1726882328.73952: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7861b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a96c90><<< 18699 1726882328.73956: stdout chunk (state=3): >>> <<< 18699 1726882328.74138: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18699 1726882328.74349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18699 1726882328.74356: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a788b1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 18699 1726882328.74379: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18699 1726882328.74480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a78af530> <<< 18699 1726882328.74522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18699 1726882328.74601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18699 1726882328.74725: stdout chunk (state=3): >>>import 'ntpath' # <<< 18699 1726882328.74807: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 18699 1726882328.74845: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7910320> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18699 1726882328.74880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18699 1726882328.74924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18699 1726882328.75042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18699 1726882328.75109: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7912a80> <<< 
18699 1726882328.75230: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7910440> <<< 18699 1726882328.75298: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a78d5340> <<< 18699 1726882328.75323: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7711430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a78ae330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7863da0> <<< 18699 1726882328.75511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe7a77116d0> <<< 18699 1726882328.75677: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_jpc839i9/ansible_stat_payload.zip' # zipimport: zlib available <<< 18699 1726882328.75916: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18699 1726882328.75945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18699 1726882328.75973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18699 1726882328.76118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 18699 1726882328.76136: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7767140> import '_typing' # <<< 18699 1726882328.76388: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7746030> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a77451c0> <<< 18699 1726882328.76427: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.76430: stdout chunk (state=3): >>>import 'ansible' # <<< 18699 1726882328.76473: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 1726882328.76477: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.76504: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 18699 1726882328.78864: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.80411: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 18699 1726882328.80444: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7765430> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882328.80471: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 18699 1726882328.80512: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches 
/usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a778eb10> <<< 18699 1726882328.80550: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778e8a0> <<< 18699 1726882328.80615: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778e1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 18699 1726882328.80725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778ec30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7767dd0> import 'atexit' # <<< 18699 1726882328.80824: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a778f860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a778faa0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from 
'/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 18699 1726882328.80870: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778ffb0> <<< 18699 1726882328.80912: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18699 1726882328.80959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18699 1726882328.81058: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a710dd00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a710f920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18699 1726882328.81101: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71102f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18699 1726882328.81133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18699 1726882328.81173: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7111490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18699 1726882328.81282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 18699 1726882328.81305: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7113f20> <<< 18699 1726882328.81345: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7a02e70> <<< 18699 1726882328.81389: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71121e0> <<< 18699 1726882328.81416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 18699 1726882328.81479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 18699 1726882328.81600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711be60> <<< 18699 1726882328.81646: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe7a711a930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711a690> <<< 18699 1726882328.81657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18699 1726882328.81756: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711ac00> <<< 18699 1726882328.81789: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71126f0> <<< 18699 1726882328.81834: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.81856: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7163f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7164230> <<< 18699 1726882328.81889: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18699 1726882328.82016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7165ca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7165a60> <<< 18699 1726882328.82036: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18699 1726882328.82116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18699 1726882328.82153: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.82188: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7168230> <<< 18699 1726882328.82201: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7166390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18699 1726882328.82241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882328.82301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 18699 1726882328.82318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18699 1726882328.82349: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a716b980> <<< 18699 1726882328.82560: stdout chunk 
(state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7168380> <<< 18699 1726882328.82704: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716c770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.82707: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716cbf0><<< 18699 1726882328.82783: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.82804: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716cc80> <<< 18699 1726882328.82911: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71643b0> <<< 18699 1726882328.82914: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 18699 
1726882328.82972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.83012: stdout chunk (state=3): >>> # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.83045: stdout chunk (state=3): >>> <<< 18699 1726882328.83339: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a71f8380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a71f9610> <<< 18699 1726882328.83361: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a716eb10> <<< 18699 1726882328.83403: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.83433: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.83543: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a716e750> <<< 18699 1726882328.83573: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18699 1726882328.83706: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 
1726882328.83814: stdout chunk (state=3): >>> <<< 18699 1726882328.83850: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.83903: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 18699 1726882328.84018: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18699 1726882328.84192: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.84232: stdout chunk (state=3): >>> <<< 18699 1726882328.84413: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.84530: stdout chunk (state=3): >>> <<< 18699 1726882328.85399: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.86292: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 18699 1726882328.86316: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 18699 1726882328.86357: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 18699 1726882328.86631: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882328.86635: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a71fd790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 
'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71fe6c0> <<< 18699 1726882328.86646: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71f9880><<< 18699 1726882328.86714: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.selinux' # <<< 18699 1726882328.86748: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.86763: stdout chunk (state=3): >>> <<< 18699 1726882328.86790: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.86847: stdout chunk (state=3): >>> import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18699 1726882328.87214: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.87316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 18699 1726882328.87395: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71fe5a0> # zipimport: zlib available <<< 18699 1726882328.88153: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.88172: stdout chunk (state=3): >>> <<< 18699 1726882328.88912: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.89023: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.89045: stdout chunk (state=3): >>> <<< 18699 1726882328.89147: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18699 1726882328.89168: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.89185: stdout chunk (state=3): >>> <<< 18699 1726882328.89234: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.89255: stdout chunk (state=3): >>> <<< 18699 1726882328.89342: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' 
# # zipimport: zlib available<<< 18699 1726882328.89417: stdout chunk (state=3): >>> # zipimport: zlib available <<< 18699 1726882328.89555: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18699 1726882328.89590: stdout chunk (state=3): >>> # zipimport: zlib available <<< 18699 1726882328.89618: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.89684: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing' # <<< 18699 1726882328.89773: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.89776: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.89820: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing.convert_bool' # <<< 18699 1726882328.89851: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.90245: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.90750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 18699 1726882328.90857: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71ff800> # zipimport: zlib available <<< 18699 1726882328.90972: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.91079: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 18699 1726882328.91108: stdout chunk (state=3): >>> import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18699 1726882328.91121: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 18699 1726882328.91151: stdout chunk (state=3): >>> # zipimport: zlib available <<< 18699 1726882328.91220: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.91296: stdout chunk (state=3): >>> import 
'ansible.module_utils.common.locale' # <<< 18699 1726882328.91325: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.91406: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.91569: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 18699 1726882328.91582: stdout chunk (state=3): >>> <<< 18699 1726882328.91683: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 18699 1726882328.91695: stdout chunk (state=3): >>> <<< 18699 1726882328.91752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18699 1726882328.91867: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 18699 1726882328.91929: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 18699 1726882328.92054: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a700a150> <<< 18699 1726882328.92068: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a70074a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 18699 1726882328.92175: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 18699 1726882328.92178: stdout chunk (state=3): >>> <<< 18699 1726882328.92274: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.92321: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.92332: stdout chunk (state=3): >>> <<< 18699 1726882328.92401: stdout chunk 
(state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 18699 1726882328.92422: stdout chunk (state=3): >>> <<< 18699 1726882328.92449: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18699 1726882328.92497: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18699 1726882328.92722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18699 1726882328.92757: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a77eaa80><<< 18699 1726882328.92777: stdout chunk (state=3): >>> <<< 18699 1726882328.92832: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a77da750><<< 18699 1726882328.92850: stdout chunk (state=3): >>> <<< 18699 1726882328.92937: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a700a270><<< 18699 1726882328.92959: stdout chunk (state=3): >>> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71ff170> # destroy ansible.module_utils.distro<<< 18699 1726882328.92985: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 18699 1726882328.93040: stdout chunk (state=3): >>> # zipimport: zlib 
available<<< 18699 1726882328.93055: stdout chunk (state=3): >>> <<< 18699 1726882328.93099: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 18699 1726882328.93152: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 18699 1726882328.93198: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 18699 1726882328.93210: stdout chunk (state=3): >>> <<< 18699 1726882328.93278: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18699 1726882328.93377: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 18699 1726882328.93391: stdout chunk (state=3): >>># zipimport: zlib available <<< 18699 1726882328.93542: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.93715: stdout chunk (state=3): >>> <<< 18699 1726882328.93864: stdout chunk (state=3): >>># zipimport: zlib available<<< 18699 1726882328.93883: stdout chunk (state=3): >>> <<< 18699 1726882328.94132: stdout chunk (state=3): >>> <<< 18699 1726882328.94135: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 18699 1726882328.94155: stdout chunk (state=3): >>># destroy __main__ <<< 18699 1726882328.94504: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 18699 1726882328.94555: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._<<< 18699 1726882328.94695: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 18699 1726882328.94701: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing 
builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc<<< 18699 1726882328.94734: stdout chunk (state=3): >>> # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword<<< 18699 1726882328.94744: stdout chunk (state=3): >>> # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools<<< 18699 1726882328.94885: stdout chunk (state=3): >>> # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg<<< 18699 1726882328.94892: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing 
base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery<<< 18699 1726882328.94911: stdout chunk (state=3): >>> # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp 
# cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex<<< 18699 1726882328.94942: stdout chunk (state=3): >>> # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid <<< 18699 1726882328.94965: stdout chunk (state=3): >>># cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket<<< 18699 1726882328.95005: stdout chunk (state=3): >>> # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes <<< 18699 1726882328.95026: stdout 
chunk (state=3): >>># cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors<<< 18699 1726882328.95326: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18699 1726882328.95404: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 18699 1726882328.95437: stdout chunk (state=3): >>> <<< 18699 1726882328.95460: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18699 1726882328.95491: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression<<< 18699 1726882328.95513: stdout chunk (state=3): >>> # destroy _lzma # destroy _blake2<<< 18699 1726882328.95537: stdout chunk (state=3): >>> # destroy binascii <<< 18699 1726882328.95563: stdout chunk (state=3): >>># destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile<<< 18699 1726882328.95574: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch<<< 18699 1726882328.95631: stdout chunk (state=3): >>> # destroy ipaddress # destroy ntpath<<< 18699 1726882328.95662: stdout chunk (state=3): >>> <<< 18699 1726882328.95688: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 18699 1726882328.95723: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings<<< 18699 1726882328.95803: stdout chunk (state=3): >>> # destroy _locale<<< 18699 1726882328.95806: stdout chunk (state=3): >>> # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy 
_posixsubprocess # destroy syslog<<< 18699 1726882328.95843: stdout chunk (state=3): >>> # destroy uuid # destroy selectors<<< 18699 1726882328.95846: stdout chunk (state=3): >>> # destroy errno # destroy array<<< 18699 1726882328.95888: stdout chunk (state=3): >>> # destroy datetime<<< 18699 1726882328.95890: stdout chunk (state=3): >>> # destroy selinux<<< 18699 1726882328.95972: stdout chunk (state=3): >>> # destroy shutil # destroy distro<<< 18699 1726882328.95975: stdout chunk (state=3): >>> # destroy distro.distro # destroy argparse # destroy json # destroy logging<<< 18699 1726882328.96004: stdout chunk (state=3): >>> # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 18699 1726882328.96024: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 18699 1726882328.96048: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 18699 1726882328.96083: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 18699 1726882328.96120: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc<<< 18699 1726882328.96145: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 18699 1726882328.96174: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] 
wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 18699 1726882328.96248: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools<<< 18699 1726882328.96251: stdout chunk (state=3): >>> # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 18699 1726882328.96254: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat<<< 18699 1726882328.96328: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 18699 1726882328.96333: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp<<< 18699 1726882328.96336: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon <<< 18699 1726882328.96564: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring <<< 18699 1726882328.96568: stdout chunk (state=3): >>># destroy _socket <<< 18699 1726882328.96607: stdout chunk 
(state=3): >>># destroy _collections # destroy platform # destroy _uuid <<< 18699 1726882328.96632: stdout chunk (state=3): >>># destroy stat # destroy genericpath <<< 18699 1726882328.96767: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 18699 1726882328.96771: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 18699 1726882328.96806: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response <<< 18699 1726882328.96809: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 18699 1726882328.96830: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path<<< 18699 1726882328.96918: stdout chunk (state=3): >>> # clear sys.modules # destroy _frozen_importlib <<< 18699 1726882328.97008: stdout chunk (state=3): >>># destroy codecs <<< 18699 1726882328.97020: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io<<< 18699 1726882328.97065: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings<<< 18699 1726882328.97098: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 18699 1726882328.97125: stdout chunk (state=3): >>> # destroy _random # destroy _weakref <<< 18699 1726882328.97234: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc<<< 18699 
1726882328.97255: stdout chunk (state=3): >>> # destroy _sre # destroy posix <<< 18699 1726882328.97309: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks<<< 18699 1726882328.97535: stdout chunk (state=3): >>> <<< 18699 1726882328.98063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882328.98067: stdout chunk (state=3): >>><<< 18699 1726882328.98069: stderr chunk (state=3): >>><<< 18699 1726882328.98080: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 
'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a79c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a79c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a07e30> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a07ef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a3f860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a3fef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a1fb00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a1d220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a04fe0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a5f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a5e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a1e0f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a5cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a94830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a04260> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7a94ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a94b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7a94f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a02d80> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a95670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a95340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a96570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aac770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7aade50> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aaecf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7aaf320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aae240> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7aafda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7aaf4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a964e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7837cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7860710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7860470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7860740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7861070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7861a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7860920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7835e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7862e40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7861b80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7a96c90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7a788b1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a78af530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7910320> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7912a80> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7910440> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a78d5340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7711430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a78ae330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7863da0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe7a77116d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_jpc839i9/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7767140> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7746030> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a77451c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7765430> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a778eb10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778e8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778e1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778ec30> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7767dd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a778f860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a778faa0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a778ffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a710dd00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a710f920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71102f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7111490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7113f20> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7a02e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71121e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711be60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711a930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711a690> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a711ac00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71126f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7163f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7164230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7165ca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7165a60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a7168230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7166390> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a716b980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a7168380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716c770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716cbf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716cc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71643b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a71f8380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a71f9610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a716eb10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a716fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a716e750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a71fd790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71fe6c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71f9880> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71fe5a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71ff800> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe7a700a150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a70074a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a77eaa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a77da750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a700a270> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe7a71ff170> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv 18699 1726882328.99227: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None,
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882328.99230: _low_level_execute_command(): starting 18699 1726882328.99233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882328.4289849-18827-78238000143948/ > /dev/null 2>&1 && sleep 0' 18699 1726882328.99686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882328.99828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882328.99832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882328.99873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882329.00025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882329.02546: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 18699 1726882329.02550: stdout chunk (state=3): >>><<< 18699 1726882329.02556: stderr chunk (state=3): >>><<< 18699 1726882329.02614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882329.02620: handler run complete 18699 1726882329.02639: attempt loop complete, returning result 18699 1726882329.02642: _execute() done 18699 1726882329.02644: dumping result to json 18699 1726882329.02695: done dumping result, returning 18699 1726882329.02699: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [12673a56-9f93-1ce6-d207-00000000008f] 18699 1726882329.02702: sending task result for task 12673a56-9f93-1ce6-d207-00000000008f ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18699 1726882329.02864: no more pending results, 
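[Editor's note] The task finishing above can be reconstructed from what the log itself shows: `_execute_module(stat, {'path': '/run/ostree-booted', ...})` and the registered variable name `__ostree_booted_stat` that appears when the flag is set in the next task. A minimal sketch of what this task in `el_repo_setup.yml` likely looks like; the exact options in the real file are an assumption:

```yaml
# Hypothetical reconstruction -- inferred from the log, not copied from
# the actual el_repo_setup.yml.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # path seen in the _execute_module() arguments
  register: __ostree_booted_stat    # variable name seen later in the log
```

The result `ok: [managed_node1] => {"changed": false, "stat": {"exists": false}}` tells us this managed node is not an ostree-based system.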
returning what we have 18699 1726882329.02867: results queue empty 18699 1726882329.02868: checking for any_errors_fatal 18699 1726882329.02873: done checking for any_errors_fatal 18699 1726882329.02874: checking for max_fail_percentage 18699 1726882329.02876: done checking for max_fail_percentage 18699 1726882329.02877: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.02878: done checking to see if all hosts have failed 18699 1726882329.02878: getting the remaining hosts for this loop 18699 1726882329.02880: done getting the remaining hosts for this loop 18699 1726882329.02884: getting the next task for host managed_node1 18699 1726882329.02889: done getting next task for host managed_node1 18699 1726882329.02892: ^ task is: TASK: Set flag to indicate system is ostree 18699 1726882329.02897: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.02900: getting variables 18699 1726882329.02902: in VariableManager get_vars() 18699 1726882329.02937: Calling all_inventory to load vars for managed_node1 18699 1726882329.02940: Calling groups_inventory to load vars for managed_node1 18699 1726882329.02944: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.02955: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.02959: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.02961: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.03166: done sending task result for task 12673a56-9f93-1ce6-d207-00000000008f 18699 1726882329.03169: WORKER PROCESS EXITING 18699 1726882329.03190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.03387: done with get_vars() 18699 1726882329.03399: done getting variables 18699 1726882329.03489: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:32:09 -0400 (0:00:00.673) 0:00:02.631 ****** 18699 1726882329.03518: entering _queue_task() for managed_node1/set_fact 18699 1726882329.03520: Creating lock for set_fact 18699 1726882329.04241: worker is 1 (out of 1 available) 18699 1726882329.04252: exiting _queue_task() for managed_node1/set_fact 18699 1726882329.04263: done queuing things up, now waiting for results queue to drain 18699 1726882329.04264: waiting for pending results... 
18699 1726882329.04439: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 18699 1726882329.04928: in run() - task 12673a56-9f93-1ce6-d207-000000000090 18699 1726882329.04931: variable 'ansible_search_path' from source: unknown 18699 1726882329.04934: variable 'ansible_search_path' from source: unknown 18699 1726882329.04937: calling self._execute() 18699 1726882329.04964: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.04976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.04989: variable 'omit' from source: magic vars 18699 1726882329.05972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882329.06467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882329.06521: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882329.06558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882329.06598: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882329.06685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882329.06723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882329.06751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882329.06782: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882329.06906: Evaluated conditional (not __network_is_ostree is defined): True 18699 1726882329.06921: variable 'omit' from source: magic vars 18699 1726882329.06961: variable 'omit' from source: magic vars 18699 1726882329.07079: variable '__ostree_booted_stat' from source: set_fact 18699 1726882329.07135: variable 'omit' from source: magic vars 18699 1726882329.07168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882329.07202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882329.07225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882329.07242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882329.07261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882329.07297: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882329.07308: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.07378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.07468: Set connection var ansible_connection to ssh 18699 1726882329.07481: Set connection var ansible_pipelining to False 18699 1726882329.07490: Set connection var ansible_shell_executable to /bin/sh 18699 1726882329.07538: Set connection var ansible_timeout to 10 18699 1726882329.07540: Set connection var ansible_shell_type to sh 18699 1726882329.07542: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882329.07686: variable 'ansible_shell_executable' 
from source: unknown 18699 1726882329.07690: variable 'ansible_connection' from source: unknown 18699 1726882329.07692: variable 'ansible_module_compression' from source: unknown 18699 1726882329.07698: variable 'ansible_shell_type' from source: unknown 18699 1726882329.07701: variable 'ansible_shell_executable' from source: unknown 18699 1726882329.07703: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.07705: variable 'ansible_pipelining' from source: unknown 18699 1726882329.07707: variable 'ansible_timeout' from source: unknown 18699 1726882329.07709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.07737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882329.07754: variable 'omit' from source: magic vars 18699 1726882329.07783: starting attempt loop 18699 1726882329.07801: running the handler 18699 1726882329.07822: handler run complete 18699 1726882329.07858: attempt loop complete, returning result 18699 1726882329.07865: _execute() done 18699 1726882329.07872: dumping result to json 18699 1726882329.07879: done dumping result, returning 18699 1726882329.07890: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [12673a56-9f93-1ce6-d207-000000000090] 18699 1726882329.07906: sending task result for task 12673a56-9f93-1ce6-d207-000000000090 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 18699 1726882329.08150: no more pending results, returning what we have 18699 1726882329.08153: results queue empty 18699 1726882329.08154: checking for any_errors_fatal 18699 1726882329.08159: done checking for any_errors_fatal 18699 
1726882329.08160: checking for max_fail_percentage 18699 1726882329.08162: done checking for max_fail_percentage 18699 1726882329.08162: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.08163: done checking to see if all hosts have failed 18699 1726882329.08164: getting the remaining hosts for this loop 18699 1726882329.08165: done getting the remaining hosts for this loop 18699 1726882329.08169: getting the next task for host managed_node1 18699 1726882329.08204: done getting next task for host managed_node1 18699 1726882329.08208: ^ task is: TASK: Fix CentOS6 Base repo 18699 1726882329.08210: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.08215: getting variables 18699 1726882329.08217: in VariableManager get_vars() 18699 1726882329.08351: Calling all_inventory to load vars for managed_node1 18699 1726882329.08360: Calling groups_inventory to load vars for managed_node1 18699 1726882329.08364: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.08376: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.08379: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.08382: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.08762: done sending task result for task 12673a56-9f93-1ce6-d207-000000000090 18699 1726882329.08771: WORKER PROCESS EXITING 18699 1726882329.08798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.08998: done with get_vars() 18699 1726882329.09009: done getting variables 18699 1726882329.09127: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:32:09 -0400 (0:00:00.056) 0:00:02.687 ****** 18699 1726882329.09154: entering _queue_task() for managed_node1/copy 18699 1726882329.09625: worker is 1 (out of 1 available) 18699 1726882329.09635: exiting _queue_task() for managed_node1/copy 18699 1726882329.09643: done queuing things up, now waiting for results queue to drain 18699 1726882329.09644: waiting for pending results... 
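[Editor's note] The 'Set flag to indicate system is ostree' task above evaluated the conditional `not __network_is_ostree is defined` (True), read `__ostree_booted_stat`, and produced the fact `__network_is_ostree: false`. A hedged sketch of how such a task is typically written; the exact Jinja expression in the real file is an assumption:

```yaml
# Hypothetical reconstruction based on the evaluated conditional and the
# resulting fact shown in the log above.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Since the earlier `stat` returned `exists: false`, the fact comes out `false`, matching the task result in the log.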
18699 1726882329.10011: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 18699 1726882329.10016: in run() - task 12673a56-9f93-1ce6-d207-000000000092 18699 1726882329.10101: variable 'ansible_search_path' from source: unknown 18699 1726882329.10104: variable 'ansible_search_path' from source: unknown 18699 1726882329.10107: calling self._execute() 18699 1726882329.10178: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.10189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.10209: variable 'omit' from source: magic vars 18699 1726882329.10692: variable 'ansible_distribution' from source: facts 18699 1726882329.10722: Evaluated conditional (ansible_distribution == 'CentOS'): True 18699 1726882329.10843: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.10856: Evaluated conditional (ansible_distribution_major_version == '6'): False 18699 1726882329.10864: when evaluation is False, skipping this task 18699 1726882329.10871: _execute() done 18699 1726882329.10878: dumping result to json 18699 1726882329.10885: done dumping result, returning 18699 1726882329.10904: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [12673a56-9f93-1ce6-d207-000000000092] 18699 1726882329.10916: sending task result for task 12673a56-9f93-1ce6-d207-000000000092 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18699 1726882329.11271: no more pending results, returning what we have 18699 1726882329.11274: results queue empty 18699 1726882329.11275: checking for any_errors_fatal 18699 1726882329.11280: done checking for any_errors_fatal 18699 1726882329.11281: checking for max_fail_percentage 18699 1726882329.11282: done checking for max_fail_percentage 18699 1726882329.11284: checking to see if all hosts have failed and the 
running result is not ok 18699 1726882329.11285: done checking to see if all hosts have failed 18699 1726882329.11285: getting the remaining hosts for this loop 18699 1726882329.11287: done getting the remaining hosts for this loop 18699 1726882329.11291: getting the next task for host managed_node1 18699 1726882329.11305: done getting next task for host managed_node1 18699 1726882329.11308: ^ task is: TASK: Include the task 'enable_epel.yml' 18699 1726882329.11312: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.11317: getting variables 18699 1726882329.11318: in VariableManager get_vars() 18699 1726882329.11349: Calling all_inventory to load vars for managed_node1 18699 1726882329.11352: Calling groups_inventory to load vars for managed_node1 18699 1726882329.11356: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.11485: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.11489: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.11492: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.11509: done sending task result for task 12673a56-9f93-1ce6-d207-000000000092 18699 1726882329.11512: WORKER PROCESS EXITING 18699 1726882329.11886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.12112: done with get_vars() 18699 1726882329.12123: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:32:09 -0400 (0:00:00.030) 0:00:02.718 ****** 18699 1726882329.12217: entering _queue_task() for managed_node1/include_tasks 18699 1726882329.12482: worker is 1 (out of 1 available) 18699 1726882329.12698: exiting _queue_task() for managed_node1/include_tasks 18699 1726882329.12708: done queuing things up, now waiting for results queue to drain 18699 1726882329.12709: waiting for pending results... 
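[Editor's note] The skipped 'Fix CentOS6 Base repo' task loaded the `copy` action and was gated on the two conditionals the log shows being evaluated (`ansible_distribution == 'CentOS'`: True; `ansible_distribution_major_version == '6'`: False). A sketch; the destination and file contents delivered by `copy` are not visible in the log and are placeholders:

```yaml
# Hypothetical reconstruction; only the module name and the two
# conditionals are taken from the log.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed destination
    content: |
      # (repo definition not recoverable from the log)
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this CentOS 10 node the second conditional is False, hence `skip_reason: "Conditional result was False"`.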
18699 1726882329.12742: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 18699 1726882329.12853: in run() - task 12673a56-9f93-1ce6-d207-000000000093 18699 1726882329.12869: variable 'ansible_search_path' from source: unknown 18699 1726882329.12909: variable 'ansible_search_path' from source: unknown 18699 1726882329.12936: calling self._execute() 18699 1726882329.13024: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.13045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.13054: variable 'omit' from source: magic vars 18699 1726882329.13701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882329.15913: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882329.15965: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882329.16005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882329.16034: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882329.16054: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882329.16121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882329.16142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882329.16160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882329.16185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882329.16201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882329.16286: variable '__network_is_ostree' from source: set_fact 18699 1726882329.16309: Evaluated conditional (not __network_is_ostree | d(false)): True 18699 1726882329.16313: _execute() done 18699 1726882329.16315: dumping result to json 18699 1726882329.16368: done dumping result, returning 18699 1726882329.16397: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-1ce6-d207-000000000093] 18699 1726882329.16400: sending task result for task 12673a56-9f93-1ce6-d207-000000000093 18699 1726882329.16469: done sending task result for task 12673a56-9f93-1ce6-d207-000000000093 18699 1726882329.16471: WORKER PROCESS EXITING 18699 1726882329.16516: no more pending results, returning what we have 18699 1726882329.16522: in VariableManager get_vars() 18699 1726882329.16554: Calling all_inventory to load vars for managed_node1 18699 1726882329.16557: Calling groups_inventory to load vars for managed_node1 18699 1726882329.16560: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.16570: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.16572: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.16575: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.16788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 18699 1726882329.16957: done with get_vars() 18699 1726882329.16964: variable 'ansible_search_path' from source: unknown 18699 1726882329.16965: variable 'ansible_search_path' from source: unknown 18699 1726882329.17003: we have included files to process 18699 1726882329.17004: generating all_blocks data 18699 1726882329.17006: done generating all_blocks data 18699 1726882329.17011: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18699 1726882329.17012: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18699 1726882329.17015: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18699 1726882329.17703: done processing included file 18699 1726882329.17706: iterating over new_blocks loaded from include file 18699 1726882329.17709: in VariableManager get_vars() 18699 1726882329.17721: done with get_vars() 18699 1726882329.17723: filtering new block on tags 18699 1726882329.17749: done filtering new block on tags 18699 1726882329.17752: in VariableManager get_vars() 18699 1726882329.17762: done with get_vars() 18699 1726882329.17763: filtering new block on tags 18699 1726882329.17777: done filtering new block on tags 18699 1726882329.17779: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 18699 1726882329.17785: extending task lists for all hosts with included blocks 18699 1726882329.18019: done extending task lists 18699 1726882329.18023: done processing included files 18699 1726882329.18024: results queue empty 18699 1726882329.18025: checking for any_errors_fatal 18699 1726882329.18028: done checking for any_errors_fatal 18699 1726882329.18028: checking for max_fail_percentage 18699 1726882329.18029: done 
checking for max_fail_percentage 18699 1726882329.18030: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.18031: done checking to see if all hosts have failed 18699 1726882329.18032: getting the remaining hosts for this loop 18699 1726882329.18033: done getting the remaining hosts for this loop 18699 1726882329.18035: getting the next task for host managed_node1 18699 1726882329.18039: done getting next task for host managed_node1 18699 1726882329.18043: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18699 1726882329.18045: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.18048: getting variables 18699 1726882329.18048: in VariableManager get_vars() 18699 1726882329.18057: Calling all_inventory to load vars for managed_node1 18699 1726882329.18059: Calling groups_inventory to load vars for managed_node1 18699 1726882329.18061: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.18066: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.18074: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.18077: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.18337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.18725: done with get_vars() 18699 1726882329.18734: done getting variables 18699 1726882329.18873: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18699 1726882329.19114: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:32:09 -0400 (0:00:00.069) 0:00:02.787 ****** 18699 1726882329.19173: entering _queue_task() for managed_node1/command 18699 1726882329.19175: Creating lock for command 18699 1726882329.19607: worker is 1 (out of 1 available) 18699 1726882329.19620: exiting _queue_task() for managed_node1/command 18699 1726882329.19632: done queuing things up, now waiting for results queue to drain 18699 1726882329.19633: waiting for pending results... 
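[Editor's note] The include above evaluated `not __network_is_ostree | d(false)` (True) and then loaded `tasks/enable_epel.yml`, whose resolved path appears in the log. A hedged sketch of the include task:

```yaml
# Hypothetical reconstruction from the include processing shown above.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml   # resolved in the log to
                                         # .../tests/network/tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Because `__network_is_ostree` was set to `false` earlier, `not false | d(false)` is True and the tasks file is included and its blocks appended to the host's task list.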
18699 1726882329.19952: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 18699 1726882329.20020: in run() - task 12673a56-9f93-1ce6-d207-0000000000ad 18699 1726882329.20032: variable 'ansible_search_path' from source: unknown 18699 1726882329.20035: variable 'ansible_search_path' from source: unknown 18699 1726882329.20064: calling self._execute() 18699 1726882329.20118: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.20122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.20136: variable 'omit' from source: magic vars 18699 1726882329.20397: variable 'ansible_distribution' from source: facts 18699 1726882329.20401: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18699 1726882329.20488: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.20491: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18699 1726882329.20498: when evaluation is False, skipping this task 18699 1726882329.20502: _execute() done 18699 1726882329.20505: dumping result to json 18699 1726882329.20507: done dumping result, returning 18699 1726882329.20513: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [12673a56-9f93-1ce6-d207-0000000000ad] 18699 1726882329.20516: sending task result for task 12673a56-9f93-1ce6-d207-0000000000ad 18699 1726882329.20616: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000ad 18699 1726882329.20619: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18699 1726882329.20672: no more pending results, returning what we have 18699 1726882329.20675: results queue empty 18699 1726882329.20676: checking for any_errors_fatal 18699 1726882329.20677: done checking for any_errors_fatal 18699 1726882329.20677: checking for 
max_fail_percentage 18699 1726882329.20679: done checking for max_fail_percentage 18699 1726882329.20680: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.20680: done checking to see if all hosts have failed 18699 1726882329.20681: getting the remaining hosts for this loop 18699 1726882329.20682: done getting the remaining hosts for this loop 18699 1726882329.20686: getting the next task for host managed_node1 18699 1726882329.20697: done getting next task for host managed_node1 18699 1726882329.20699: ^ task is: TASK: Install yum-utils package 18699 1726882329.20702: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.20711: getting variables 18699 1726882329.20713: in VariableManager get_vars() 18699 1726882329.20738: Calling all_inventory to load vars for managed_node1 18699 1726882329.20740: Calling groups_inventory to load vars for managed_node1 18699 1726882329.20743: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.20751: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.20754: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.20756: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.20897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.21016: done with get_vars() 18699 1726882329.21022: done getting variables 18699 1726882329.21090: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:32:09 -0400 (0:00:00.019) 0:00:02.807 ****** 18699 1726882329.21114: entering _queue_task() for managed_node1/package 18699 1726882329.21115: Creating lock for package 18699 1726882329.21315: worker is 1 (out of 1 available) 18699 1726882329.21328: exiting _queue_task() for managed_node1/package 18699 1726882329.21338: done queuing things up, now waiting for results queue to drain 18699 1726882329.21339: waiting for pending results... 
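The `skipping: [managed_node1]` results above all come from the same gating pattern in `enable_epel.yml`: each EPEL bootstrap task carries `when:` conditionals on distribution facts, and on this host the major-version membership check evaluates False. A minimal, hypothetical sketch of that pattern (task body and arguments are illustrative assumptions, not the actual file contents):

```yaml
# Illustrative sketch only -- the real tasks in enable_epel.yml differ in detail.
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present
  when:
    # All list items must be true. The log shows the first condition
    # evaluating True and the second False, so the task is skipped.
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

When a `when:` condition evaluates False, the task executor short-circuits inside `_execute()` before any module code is queued to the remote host, which is why the skipped result carries only `false_condition` and `skip_reason` rather than module output.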
18699 1726882329.21565: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 18699 1726882329.21901: in run() - task 12673a56-9f93-1ce6-d207-0000000000ae 18699 1726882329.21905: variable 'ansible_search_path' from source: unknown 18699 1726882329.21907: variable 'ansible_search_path' from source: unknown 18699 1726882329.21909: calling self._execute() 18699 1726882329.21911: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.21913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.21916: variable 'omit' from source: magic vars 18699 1726882329.22537: variable 'ansible_distribution' from source: facts 18699 1726882329.22552: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18699 1726882329.22703: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.22741: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18699 1726882329.22748: when evaluation is False, skipping this task 18699 1726882329.22754: _execute() done 18699 1726882329.22760: dumping result to json 18699 1726882329.22766: done dumping result, returning 18699 1726882329.22776: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [12673a56-9f93-1ce6-d207-0000000000ae] 18699 1726882329.22783: sending task result for task 12673a56-9f93-1ce6-d207-0000000000ae skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18699 1726882329.22968: no more pending results, returning what we have 18699 1726882329.22971: results queue empty 18699 1726882329.22972: checking for any_errors_fatal 18699 1726882329.22979: done checking for any_errors_fatal 18699 1726882329.22979: checking for max_fail_percentage 18699 1726882329.22981: done checking for max_fail_percentage 18699 1726882329.22982: checking to see if 
all hosts have failed and the running result is not ok 18699 1726882329.22983: done checking to see if all hosts have failed 18699 1726882329.22983: getting the remaining hosts for this loop 18699 1726882329.22985: done getting the remaining hosts for this loop 18699 1726882329.22989: getting the next task for host managed_node1 18699 1726882329.23001: done getting next task for host managed_node1 18699 1726882329.23004: ^ task is: TASK: Enable EPEL 7 18699 1726882329.23008: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.23020: getting variables 18699 1726882329.23023: in VariableManager get_vars() 18699 1726882329.23055: Calling all_inventory to load vars for managed_node1 18699 1726882329.23058: Calling groups_inventory to load vars for managed_node1 18699 1726882329.23062: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.23074: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.23078: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.23081: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.23478: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000ae 18699 1726882329.23482: WORKER PROCESS EXITING 18699 1726882329.23507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.23714: done with get_vars() 18699 1726882329.23724: done getting variables 18699 1726882329.23786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:32:09 -0400 (0:00:00.027) 0:00:02.834 ****** 18699 1726882329.23821: entering _queue_task() for managed_node1/command 18699 1726882329.24059: worker is 1 (out of 1 available) 18699 1726882329.24096: exiting _queue_task() for managed_node1/command 18699 1726882329.24106: done queuing things up, now waiting for results queue to drain 18699 1726882329.24107: waiting for pending results... 
18699 1726882329.24244: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 18699 1726882329.24307: in run() - task 12673a56-9f93-1ce6-d207-0000000000af 18699 1726882329.24322: variable 'ansible_search_path' from source: unknown 18699 1726882329.24328: variable 'ansible_search_path' from source: unknown 18699 1726882329.24350: calling self._execute() 18699 1726882329.24404: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.24408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.24417: variable 'omit' from source: magic vars 18699 1726882329.24718: variable 'ansible_distribution' from source: facts 18699 1726882329.24729: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18699 1726882329.24817: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.24820: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18699 1726882329.24825: when evaluation is False, skipping this task 18699 1726882329.24827: _execute() done 18699 1726882329.24834: dumping result to json 18699 1726882329.24837: done dumping result, returning 18699 1726882329.24841: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [12673a56-9f93-1ce6-d207-0000000000af] 18699 1726882329.24846: sending task result for task 12673a56-9f93-1ce6-d207-0000000000af 18699 1726882329.24925: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000af 18699 1726882329.24928: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18699 1726882329.24971: no more pending results, returning what we have 18699 1726882329.24974: results queue empty 18699 1726882329.24975: checking for any_errors_fatal 18699 1726882329.24981: done checking for any_errors_fatal 18699 1726882329.24981: checking for 
max_fail_percentage 18699 1726882329.24983: done checking for max_fail_percentage 18699 1726882329.24984: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.24984: done checking to see if all hosts have failed 18699 1726882329.24985: getting the remaining hosts for this loop 18699 1726882329.24986: done getting the remaining hosts for this loop 18699 1726882329.24989: getting the next task for host managed_node1 18699 1726882329.24996: done getting next task for host managed_node1 18699 1726882329.24998: ^ task is: TASK: Enable EPEL 8 18699 1726882329.25001: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.25004: getting variables 18699 1726882329.25005: in VariableManager get_vars() 18699 1726882329.25027: Calling all_inventory to load vars for managed_node1 18699 1726882329.25029: Calling groups_inventory to load vars for managed_node1 18699 1726882329.25032: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.25039: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.25042: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.25044: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.25169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.25279: done with get_vars() 18699 1726882329.25286: done getting variables 18699 1726882329.25323: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:32:09 -0400 (0:00:00.015) 0:00:02.849 ****** 18699 1726882329.25343: entering _queue_task() for managed_node1/command 18699 1726882329.25505: worker is 1 (out of 1 available) 18699 1726882329.25516: exiting _queue_task() for managed_node1/command 18699 1726882329.25526: done queuing things up, now waiting for results queue to drain 18699 1726882329.25527: waiting for pending results... 
18699 1726882329.25665: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 18699 1726882329.25726: in run() - task 12673a56-9f93-1ce6-d207-0000000000b0 18699 1726882329.25734: variable 'ansible_search_path' from source: unknown 18699 1726882329.25737: variable 'ansible_search_path' from source: unknown 18699 1726882329.25764: calling self._execute() 18699 1726882329.25834: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.26000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.26004: variable 'omit' from source: magic vars 18699 1726882329.26216: variable 'ansible_distribution' from source: facts 18699 1726882329.26235: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18699 1726882329.26370: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.26380: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18699 1726882329.26387: when evaluation is False, skipping this task 18699 1726882329.26397: _execute() done 18699 1726882329.26410: dumping result to json 18699 1726882329.26425: done dumping result, returning 18699 1726882329.26439: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [12673a56-9f93-1ce6-d207-0000000000b0] 18699 1726882329.26449: sending task result for task 12673a56-9f93-1ce6-d207-0000000000b0 18699 1726882329.26652: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000b0 18699 1726882329.26655: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18699 1726882329.26692: no more pending results, returning what we have 18699 1726882329.26698: results queue empty 18699 1726882329.26699: checking for any_errors_fatal 18699 1726882329.26702: done checking for any_errors_fatal 18699 1726882329.26703: checking for 
max_fail_percentage 18699 1726882329.26704: done checking for max_fail_percentage 18699 1726882329.26705: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.26706: done checking to see if all hosts have failed 18699 1726882329.26706: getting the remaining hosts for this loop 18699 1726882329.26707: done getting the remaining hosts for this loop 18699 1726882329.26710: getting the next task for host managed_node1 18699 1726882329.26717: done getting next task for host managed_node1 18699 1726882329.26720: ^ task is: TASK: Enable EPEL 6 18699 1726882329.26723: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.26726: getting variables 18699 1726882329.26727: in VariableManager get_vars() 18699 1726882329.26750: Calling all_inventory to load vars for managed_node1 18699 1726882329.26753: Calling groups_inventory to load vars for managed_node1 18699 1726882329.26756: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.26773: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.26777: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.26784: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.26913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.27026: done with get_vars() 18699 1726882329.27032: done getting variables 18699 1726882329.27072: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:32:09 -0400 (0:00:00.017) 0:00:02.867 ****** 18699 1726882329.27101: entering _queue_task() for managed_node1/copy 18699 1726882329.27261: worker is 1 (out of 1 available) 18699 1726882329.27272: exiting _queue_task() for managed_node1/copy 18699 1726882329.27282: done queuing things up, now waiting for results queue to drain 18699 1726882329.27283: waiting for pending results... 
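The Enable EPEL 6 task queued above is gated differently from the earlier tasks: a strict equality check against a single version string instead of list membership, paired with the `copy` action rather than `command`. A hedged sketch of that shape (the `copy` arguments are placeholders; the log does not show the real `src`/`dest`/`content`):

```yaml
# Illustrative only; the actual task at enable_epel.yml:42 uses the copy
# action, but its arguments are not visible in this log.
- name: Enable EPEL 6
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/epel.repo   # hypothetical destination
    content: "..."                     # placeholder, not the real repo file
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    # Equality, not membership -- on this host the fact is '10', so the
    # log records: Evaluated conditional (ansible_distribution_major_version == '6'): False
    - ansible_distribution_major_version == '6'
```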
18699 1726882329.27417: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 18699 1726882329.27474: in run() - task 12673a56-9f93-1ce6-d207-0000000000b2 18699 1726882329.27483: variable 'ansible_search_path' from source: unknown 18699 1726882329.27486: variable 'ansible_search_path' from source: unknown 18699 1726882329.27517: calling self._execute() 18699 1726882329.27567: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.27571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.27579: variable 'omit' from source: magic vars 18699 1726882329.27879: variable 'ansible_distribution' from source: facts 18699 1726882329.27888: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18699 1726882329.27962: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.27967: Evaluated conditional (ansible_distribution_major_version == '6'): False 18699 1726882329.27970: when evaluation is False, skipping this task 18699 1726882329.27972: _execute() done 18699 1726882329.27974: dumping result to json 18699 1726882329.27986: done dumping result, returning 18699 1726882329.27989: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [12673a56-9f93-1ce6-d207-0000000000b2] 18699 1726882329.27991: sending task result for task 12673a56-9f93-1ce6-d207-0000000000b2 18699 1726882329.28075: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000b2 18699 1726882329.28078: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18699 1726882329.28121: no more pending results, returning what we have 18699 1726882329.28124: results queue empty 18699 1726882329.28124: checking for any_errors_fatal 18699 1726882329.28129: done checking for any_errors_fatal 18699 1726882329.28130: checking for max_fail_percentage 
18699 1726882329.28131: done checking for max_fail_percentage 18699 1726882329.28132: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.28133: done checking to see if all hosts have failed 18699 1726882329.28133: getting the remaining hosts for this loop 18699 1726882329.28135: done getting the remaining hosts for this loop 18699 1726882329.28137: getting the next task for host managed_node1 18699 1726882329.28144: done getting next task for host managed_node1 18699 1726882329.28146: ^ task is: TASK: Set network provider to 'nm' 18699 1726882329.28148: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882329.28151: getting variables 18699 1726882329.28152: in VariableManager get_vars() 18699 1726882329.28173: Calling all_inventory to load vars for managed_node1 18699 1726882329.28175: Calling groups_inventory to load vars for managed_node1 18699 1726882329.28178: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.28185: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.28188: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.28190: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.28319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.28428: done with get_vars() 18699 1726882329.28434: done getting variables 18699 1726882329.28470: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Friday 20 September 2024 21:32:09 -0400 (0:00:00.013) 0:00:02.880 ****** 18699 1726882329.28487: entering _queue_task() for managed_node1/set_fact 18699 1726882329.28646: worker is 1 (out of 1 available) 18699 1726882329.28656: exiting _queue_task() for managed_node1/set_fact 18699 1726882329.28667: done queuing things up, now waiting for results queue to drain 18699 1726882329.28668: waiting for pending results... 18699 1726882329.28921: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 18699 1726882329.28926: in run() - task 12673a56-9f93-1ce6-d207-000000000007 18699 1726882329.28934: variable 'ansible_search_path' from source: unknown 18699 1726882329.28971: calling self._execute() 18699 1726882329.29062: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.29074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.29089: variable 'omit' from source: magic vars 18699 1726882329.29211: variable 'omit' from source: magic vars 18699 1726882329.29263: variable 'omit' from source: magic vars 18699 1726882329.29312: variable 'omit' from source: magic vars 18699 1726882329.29372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882329.29419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882329.29461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882329.29485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882329.29566: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882329.29570: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882329.29572: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.29575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.29690: Set connection var ansible_connection to ssh 18699 1726882329.29711: Set connection var ansible_pipelining to False 18699 1726882329.29724: Set connection var ansible_shell_executable to /bin/sh 18699 1726882329.29736: Set connection var ansible_timeout to 10 18699 1726882329.29744: Set connection var ansible_shell_type to sh 18699 1726882329.29756: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882329.29806: variable 'ansible_shell_executable' from source: unknown 18699 1726882329.29900: variable 'ansible_connection' from source: unknown 18699 1726882329.29904: variable 'ansible_module_compression' from source: unknown 18699 1726882329.29906: variable 'ansible_shell_type' from source: unknown 18699 1726882329.29908: variable 'ansible_shell_executable' from source: unknown 18699 1726882329.29911: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.29913: variable 'ansible_pipelining' from source: unknown 18699 1726882329.29915: variable 'ansible_timeout' from source: unknown 18699 1726882329.29917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.30034: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882329.30053: variable 'omit' from source: magic vars 18699 1726882329.30066: starting 
attempt loop 18699 1726882329.30073: running the handler 18699 1726882329.30091: handler run complete 18699 1726882329.30126: attempt loop complete, returning result 18699 1726882329.30134: _execute() done 18699 1726882329.30143: dumping result to json 18699 1726882329.30151: done dumping result, returning 18699 1726882329.30222: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [12673a56-9f93-1ce6-d207-000000000007] 18699 1726882329.30226: sending task result for task 12673a56-9f93-1ce6-d207-000000000007 18699 1726882329.30292: done sending task result for task 12673a56-9f93-1ce6-d207-000000000007 18699 1726882329.30299: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 18699 1726882329.30377: no more pending results, returning what we have 18699 1726882329.30381: results queue empty 18699 1726882329.30381: checking for any_errors_fatal 18699 1726882329.30387: done checking for any_errors_fatal 18699 1726882329.30388: checking for max_fail_percentage 18699 1726882329.30390: done checking for max_fail_percentage 18699 1726882329.30391: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.30391: done checking to see if all hosts have failed 18699 1726882329.30392: getting the remaining hosts for this loop 18699 1726882329.30398: done getting the remaining hosts for this loop 18699 1726882329.30402: getting the next task for host managed_node1 18699 1726882329.30411: done getting next task for host managed_node1 18699 1726882329.30414: ^ task is: TASK: meta (flush_handlers) 18699 1726882329.30415: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.30419: getting variables 18699 1726882329.30422: in VariableManager get_vars() 18699 1726882329.30450: Calling all_inventory to load vars for managed_node1 18699 1726882329.30453: Calling groups_inventory to load vars for managed_node1 18699 1726882329.30457: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.30467: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.30470: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.30473: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.30884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.31331: done with get_vars() 18699 1726882329.31340: done getting variables 18699 1726882329.31424: in VariableManager get_vars() 18699 1726882329.31433: Calling all_inventory to load vars for managed_node1 18699 1726882329.31435: Calling groups_inventory to load vars for managed_node1 18699 1726882329.31438: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.31442: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.31444: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.31447: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.31605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.31811: done with get_vars() 18699 1726882329.31825: done queuing things up, now waiting for results queue to drain 18699 1726882329.31827: results queue empty 18699 1726882329.31828: checking for any_errors_fatal 18699 1726882329.31829: done checking for any_errors_fatal 18699 1726882329.31830: checking for max_fail_percentage 18699 1726882329.31831: done checking for max_fail_percentage 18699 1726882329.31832: checking to see if all hosts have failed and the running result is not 
ok 18699 1726882329.31832: done checking to see if all hosts have failed 18699 1726882329.31833: getting the remaining hosts for this loop 18699 1726882329.31834: done getting the remaining hosts for this loop 18699 1726882329.31836: getting the next task for host managed_node1 18699 1726882329.31840: done getting next task for host managed_node1 18699 1726882329.31841: ^ task is: TASK: meta (flush_handlers) 18699 1726882329.31842: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882329.31849: getting variables 18699 1726882329.31851: in VariableManager get_vars() 18699 1726882329.31858: Calling all_inventory to load vars for managed_node1 18699 1726882329.31860: Calling groups_inventory to load vars for managed_node1 18699 1726882329.31862: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.31866: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.31868: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.31871: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.32016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.32248: done with get_vars() 18699 1726882329.32256: done getting variables 18699 1726882329.32310: in VariableManager get_vars() 18699 1726882329.32318: Calling all_inventory to load vars for managed_node1 18699 1726882329.32321: Calling groups_inventory to load vars for managed_node1 18699 1726882329.32323: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.32328: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.32330: Calling groups_plugins_inventory to load vars for 
managed_node1 18699 1726882329.32333: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.32491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.32708: done with get_vars() 18699 1726882329.32720: done queuing things up, now waiting for results queue to drain 18699 1726882329.32722: results queue empty 18699 1726882329.32723: checking for any_errors_fatal 18699 1726882329.32724: done checking for any_errors_fatal 18699 1726882329.32725: checking for max_fail_percentage 18699 1726882329.32726: done checking for max_fail_percentage 18699 1726882329.32727: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.32727: done checking to see if all hosts have failed 18699 1726882329.32728: getting the remaining hosts for this loop 18699 1726882329.32729: done getting the remaining hosts for this loop 18699 1726882329.32731: getting the next task for host managed_node1 18699 1726882329.32734: done getting next task for host managed_node1 18699 1726882329.32735: ^ task is: None 18699 1726882329.32737: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.32738: done queuing things up, now waiting for results queue to drain 18699 1726882329.32739: results queue empty 18699 1726882329.32739: checking for any_errors_fatal 18699 1726882329.32740: done checking for any_errors_fatal 18699 1726882329.32741: checking for max_fail_percentage 18699 1726882329.32742: done checking for max_fail_percentage 18699 1726882329.32742: checking to see if all hosts have failed and the running result is not ok 18699 1726882329.32743: done checking to see if all hosts have failed 18699 1726882329.32745: getting the next task for host managed_node1 18699 1726882329.32747: done getting next task for host managed_node1 18699 1726882329.32748: ^ task is: None 18699 1726882329.32749: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.32800: in VariableManager get_vars() 18699 1726882329.32816: done with get_vars() 18699 1726882329.32823: in VariableManager get_vars() 18699 1726882329.32832: done with get_vars() 18699 1726882329.32837: variable 'omit' from source: magic vars 18699 1726882329.32868: in VariableManager get_vars() 18699 1726882329.32878: done with get_vars() 18699 1726882329.32903: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 18699 1726882329.33115: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882329.33143: getting the remaining hosts for this loop 18699 1726882329.33144: done getting the remaining hosts for this loop 18699 1726882329.33147: getting the next task for host managed_node1 18699 1726882329.33150: done getting next task for host managed_node1 18699 1726882329.33152: ^ task is: TASK: Gathering Facts 18699 1726882329.33153: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882329.33155: getting variables 18699 1726882329.33156: in VariableManager get_vars() 18699 1726882329.33164: Calling all_inventory to load vars for managed_node1 18699 1726882329.33166: Calling groups_inventory to load vars for managed_node1 18699 1726882329.33168: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882329.33173: Calling all_plugins_play to load vars for managed_node1 18699 1726882329.33186: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882329.33190: Calling groups_plugins_play to load vars for managed_node1 18699 1726882329.33378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882329.33487: done with get_vars() 18699 1726882329.33492: done getting variables 18699 1726882329.33520: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 21:32:09 -0400 (0:00:00.050) 0:00:02.931 ****** 18699 1726882329.33536: entering _queue_task() for managed_node1/gather_facts 18699 1726882329.33749: worker is 1 (out of 1 available) 18699 1726882329.33758: exiting _queue_task() for managed_node1/gather_facts 18699 1726882329.33768: done queuing things up, now waiting for results queue to drain 18699 1726882329.33769: waiting for pending results... 
18699 1726882329.33923: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882329.33979: in run() - task 12673a56-9f93-1ce6-d207-0000000000d8 18699 1726882329.33992: variable 'ansible_search_path' from source: unknown 18699 1726882329.34028: calling self._execute() 18699 1726882329.34083: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.34086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.34095: variable 'omit' from source: magic vars 18699 1726882329.34370: variable 'ansible_distribution_major_version' from source: facts 18699 1726882329.34380: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882329.34386: variable 'omit' from source: magic vars 18699 1726882329.34409: variable 'omit' from source: magic vars 18699 1726882329.34444: variable 'omit' from source: magic vars 18699 1726882329.34467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882329.34495: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882329.34515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882329.34529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882329.34540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882329.34565: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882329.34569: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.34571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.34639: Set connection var ansible_connection to ssh 18699 1726882329.34645: Set 
connection var ansible_pipelining to False 18699 1726882329.34652: Set connection var ansible_shell_executable to /bin/sh 18699 1726882329.34656: Set connection var ansible_timeout to 10 18699 1726882329.34659: Set connection var ansible_shell_type to sh 18699 1726882329.34672: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882329.34687: variable 'ansible_shell_executable' from source: unknown 18699 1726882329.34690: variable 'ansible_connection' from source: unknown 18699 1726882329.34692: variable 'ansible_module_compression' from source: unknown 18699 1726882329.34697: variable 'ansible_shell_type' from source: unknown 18699 1726882329.34702: variable 'ansible_shell_executable' from source: unknown 18699 1726882329.34704: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882329.34708: variable 'ansible_pipelining' from source: unknown 18699 1726882329.34711: variable 'ansible_timeout' from source: unknown 18699 1726882329.34715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882329.34847: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882329.34853: variable 'omit' from source: magic vars 18699 1726882329.34858: starting attempt loop 18699 1726882329.34861: running the handler 18699 1726882329.34873: variable 'ansible_facts' from source: unknown 18699 1726882329.34890: _low_level_execute_command(): starting 18699 1726882329.34903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882329.35408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18699 1726882329.35412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.35415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882329.35417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.35462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882329.35475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882329.35542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882329.37768: stdout chunk (state=3): >>>/root <<< 18699 1726882329.37909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882329.37939: stderr chunk (state=3): >>><<< 18699 1726882329.37942: stdout chunk (state=3): >>><<< 18699 1726882329.37963: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882329.37973: _low_level_execute_command(): starting 18699 1726882329.37978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507 `" && echo ansible-tmp-1726882329.3796167-18886-122063217402507="` echo /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507 `" ) && sleep 0' 18699 1726882329.38435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882329.38438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882329.38443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.38453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration <<< 18699 1726882329.38456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882329.38458: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.38503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882329.38506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882329.38512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882329.38558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882329.41179: stdout chunk (state=3): >>>ansible-tmp-1726882329.3796167-18886-122063217402507=/root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507 <<< 18699 1726882329.41323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882329.41350: stderr chunk (state=3): >>><<< 18699 1726882329.41354: stdout chunk (state=3): >>><<< 18699 1726882329.41373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882329.3796167-18886-122063217402507=/root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882329.41404: variable 'ansible_module_compression' from source: unknown 18699 1726882329.41442: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882329.41497: variable 'ansible_facts' from source: unknown 18699 1726882329.41633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py 18699 1726882329.41738: Sending initial data 18699 1726882329.41741: Sent initial data (154 bytes) 18699 1726882329.42192: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882329.42197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882329.42200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.42202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882329.42204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.42259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882329.42265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882329.42308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882329.44398: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882329.44407: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882329.44441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882329.44496: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpg94euav2 /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py <<< 18699 1726882329.44500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py" <<< 18699 1726882329.44536: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpg94euav2" to remote "/root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py" <<< 18699 1726882329.45586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882329.45623: stderr chunk (state=3): >>><<< 18699 1726882329.45626: stdout chunk (state=3): >>><<< 18699 1726882329.45642: done transferring module to remote 18699 1726882329.45651: _low_level_execute_command(): starting 18699 1726882329.45654: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/ /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py && sleep 0' 18699 1726882329.46088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882329.46091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882329.46096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 
1726882329.46098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882329.46103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882329.46106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.46155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882329.46160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882329.46204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882329.48610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882329.48632: stderr chunk (state=3): >>><<< 18699 1726882329.48636: stdout chunk (state=3): >>><<< 18699 1726882329.48650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882329.48653: _low_level_execute_command(): starting 18699 1726882329.48655: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/AnsiballZ_setup.py && sleep 0' 18699 1726882329.49085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882329.49088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882329.49090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.49095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882329.49097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882329.49149: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882329.49152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882329.49206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882330.32149: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", 
"month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "09", "epoch": "1726882329", "epoch_int": "1726882329", "date": "2024-09-20", "time": "21:32:09", "iso8601_micro": "2024-09-21T01:32:09.909561Z", "iso8601": "2024-09-21T01:32:09Z", "iso8601_basic": "20240920T213209909561", "iso8601_basic_short": "20240920T213209", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TT<<< 18699 1726882330.32219: stdout chunk (state=3): >>>Y": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], 
"features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": 
"off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_loadavg": {"1m": 0.4921875, "5m": 0.3232421875, "15m": 0.15673828125}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2934, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 597, "free": 2934}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 763, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794553856, "block_size": 4096, "block_total": 65519099, "block_available": 63914686, "block_used": 1604413, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882330.35140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882330.35143: stdout chunk (state=3): >>><<< 18699 1726882330.35146: stderr chunk (state=3): >>><<< 18699 1726882330.35169: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 
5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "09", "epoch": "1726882329", "epoch_int": "1726882329", "date": "2024-09-20", "time": "21:32:09", "iso8601_micro": "2024-09-21T01:32:09.909561Z", "iso8601": "2024-09-21T01:32:09Z", "iso8601_basic": "20240920T213209909561", "iso8601_basic_short": "20240920T213209", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_loadavg": {"1m": 0.4921875, "5m": 0.3232421875, "15m": 0.15673828125}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2934, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 597, "free": 2934}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 763, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794553856, "block_size": 4096, "block_total": 65519099, "block_available": 63914686, "block_used": 1604413, "inode_total": 131070960, "inode_available": 131029045, "inode_used": 41915, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882330.35580: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882330.35619: _low_level_execute_command(): starting 18699 1726882330.35645: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882329.3796167-18886-122063217402507/ > /dev/null 2>&1 && sleep 0' 18699 1726882330.36751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.36819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882330.36865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882330.36902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882330.36970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882330.39502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882330.39533: stderr chunk (state=3): >>><<< 18699 1726882330.39536: stdout chunk (state=3): >>><<< 18699 1726882330.39552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882330.39559: handler run complete 18699 1726882330.39633: variable 'ansible_facts' from source: unknown 18699 1726882330.39708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.39886: variable 'ansible_facts' from source: unknown 18699 1726882330.39939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.40018: attempt loop complete, returning result 18699 1726882330.40022: _execute() done 18699 1726882330.40024: dumping result to json 18699 1726882330.40043: done dumping result, returning 18699 1726882330.40050: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-0000000000d8] 18699 1726882330.40053: sending task result for task 12673a56-9f93-1ce6-d207-0000000000d8 18699 1726882330.40303: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000d8 18699 1726882330.40305: WORKER PROCESS EXITING ok: [managed_node1] 18699 1726882330.40517: no more pending results, returning what we have 18699 1726882330.40519: results queue empty 18699 1726882330.40520: checking for any_errors_fatal 18699 1726882330.40521: done checking for any_errors_fatal 18699 1726882330.40521: checking for max_fail_percentage 18699 1726882330.40522: done checking for max_fail_percentage 18699 1726882330.40523: checking to see if all hosts have failed and the running result is not ok 18699 1726882330.40524: done checking to see if all hosts have failed 18699 1726882330.40524: getting the remaining hosts for this loop 18699 1726882330.40525: done getting the remaining hosts for this loop 18699 1726882330.40527: getting the next task for host managed_node1 18699 
1726882330.40531: done getting next task for host managed_node1 18699 1726882330.40532: ^ task is: TASK: meta (flush_handlers) 18699 1726882330.40533: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882330.40536: getting variables 18699 1726882330.40537: in VariableManager get_vars() 18699 1726882330.40553: Calling all_inventory to load vars for managed_node1 18699 1726882330.40555: Calling groups_inventory to load vars for managed_node1 18699 1726882330.40557: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.40564: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.40566: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.40568: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.40667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.40778: done with get_vars() 18699 1726882330.40786: done getting variables 18699 1726882330.40840: in VariableManager get_vars() 18699 1726882330.40847: Calling all_inventory to load vars for managed_node1 18699 1726882330.40849: Calling groups_inventory to load vars for managed_node1 18699 1726882330.40851: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.40854: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.40855: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.40856: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.40950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.41062: done with get_vars() 18699 
1726882330.41071: done queuing things up, now waiting for results queue to drain 18699 1726882330.41072: results queue empty 18699 1726882330.41073: checking for any_errors_fatal 18699 1726882330.41075: done checking for any_errors_fatal 18699 1726882330.41075: checking for max_fail_percentage 18699 1726882330.41076: done checking for max_fail_percentage 18699 1726882330.41077: checking to see if all hosts have failed and the running result is not ok 18699 1726882330.41077: done checking to see if all hosts have failed 18699 1726882330.41081: getting the remaining hosts for this loop 18699 1726882330.41082: done getting the remaining hosts for this loop 18699 1726882330.41084: getting the next task for host managed_node1 18699 1726882330.41086: done getting next task for host managed_node1 18699 1726882330.41088: ^ task is: TASK: Show inside ethernet tests 18699 1726882330.41089: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882330.41090: getting variables 18699 1726882330.41091: in VariableManager get_vars() 18699 1726882330.41099: Calling all_inventory to load vars for managed_node1 18699 1726882330.41101: Calling groups_inventory to load vars for managed_node1 18699 1726882330.41102: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.41106: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.41107: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.41109: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.41188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.41300: done with get_vars() 18699 1726882330.41305: done getting variables 18699 1726882330.41360: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 21:32:10 -0400 (0:00:01.078) 0:00:04.009 ****** 18699 1726882330.41380: entering _queue_task() for managed_node1/debug 18699 1726882330.41382: Creating lock for debug 18699 1726882330.41612: worker is 1 (out of 1 available) 18699 1726882330.41626: exiting _queue_task() for managed_node1/debug 18699 1726882330.41638: done queuing things up, now waiting for results queue to drain 18699 1726882330.41639: waiting for pending results... 
18699 1726882330.41782: running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests 18699 1726882330.41838: in run() - task 12673a56-9f93-1ce6-d207-00000000000b 18699 1726882330.41849: variable 'ansible_search_path' from source: unknown 18699 1726882330.41881: calling self._execute() 18699 1726882330.41940: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.41944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.41953: variable 'omit' from source: magic vars 18699 1726882330.42222: variable 'ansible_distribution_major_version' from source: facts 18699 1726882330.42232: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882330.42238: variable 'omit' from source: magic vars 18699 1726882330.42259: variable 'omit' from source: magic vars 18699 1726882330.42284: variable 'omit' from source: magic vars 18699 1726882330.42319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882330.42348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882330.42365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882330.42378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882330.42388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882330.42416: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882330.42419: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.42422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.42489: Set connection var ansible_connection to ssh 18699 1726882330.42499: 
Set connection var ansible_pipelining to False 18699 1726882330.42502: Set connection var ansible_shell_executable to /bin/sh 18699 1726882330.42508: Set connection var ansible_timeout to 10 18699 1726882330.42511: Set connection var ansible_shell_type to sh 18699 1726882330.42516: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882330.42541: variable 'ansible_shell_executable' from source: unknown 18699 1726882330.42545: variable 'ansible_connection' from source: unknown 18699 1726882330.42548: variable 'ansible_module_compression' from source: unknown 18699 1726882330.42550: variable 'ansible_shell_type' from source: unknown 18699 1726882330.42553: variable 'ansible_shell_executable' from source: unknown 18699 1726882330.42555: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.42562: variable 'ansible_pipelining' from source: unknown 18699 1726882330.42564: variable 'ansible_timeout' from source: unknown 18699 1726882330.42566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.42673: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882330.42678: variable 'omit' from source: magic vars 18699 1726882330.42685: starting attempt loop 18699 1726882330.42687: running the handler 18699 1726882330.42726: handler run complete 18699 1726882330.42745: attempt loop complete, returning result 18699 1726882330.42748: _execute() done 18699 1726882330.42751: dumping result to json 18699 1726882330.42753: done dumping result, returning 18699 1726882330.42759: done running TaskExecutor() for managed_node1/TASK: Show inside ethernet tests [12673a56-9f93-1ce6-d207-00000000000b] 18699 1726882330.42762: sending task result for 
task 12673a56-9f93-1ce6-d207-00000000000b 18699 1726882330.42847: done sending task result for task 12673a56-9f93-1ce6-d207-00000000000b 18699 1726882330.42850: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Inside ethernet tests 18699 1726882330.42898: no more pending results, returning what we have 18699 1726882330.42901: results queue empty 18699 1726882330.42902: checking for any_errors_fatal 18699 1726882330.42903: done checking for any_errors_fatal 18699 1726882330.42904: checking for max_fail_percentage 18699 1726882330.42905: done checking for max_fail_percentage 18699 1726882330.42906: checking to see if all hosts have failed and the running result is not ok 18699 1726882330.42907: done checking to see if all hosts have failed 18699 1726882330.42907: getting the remaining hosts for this loop 18699 1726882330.42909: done getting the remaining hosts for this loop 18699 1726882330.42912: getting the next task for host managed_node1 18699 1726882330.42918: done getting next task for host managed_node1 18699 1726882330.42921: ^ task is: TASK: Show network_provider 18699 1726882330.42922: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882330.42925: getting variables 18699 1726882330.42927: in VariableManager get_vars() 18699 1726882330.42952: Calling all_inventory to load vars for managed_node1 18699 1726882330.42955: Calling groups_inventory to load vars for managed_node1 18699 1726882330.42960: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.42969: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.42972: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.42974: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.43144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.43255: done with get_vars() 18699 1726882330.43262: done getting variables 18699 1726882330.43305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 21:32:10 -0400 (0:00:00.019) 0:00:04.029 ****** 18699 1726882330.43325: entering _queue_task() for managed_node1/debug 18699 1726882330.43520: worker is 1 (out of 1 available) 18699 1726882330.43533: exiting _queue_task() for managed_node1/debug 18699 1726882330.43544: done queuing things up, now waiting for results queue to drain 18699 1726882330.43545: waiting for pending results... 
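The two tasks traced here come from `tests_ethernet.yml` (task paths `:6` and `:9`). The playbook source itself is not part of this log, so the following is only a hypothetical reconstruction, inferred from the task names and their result output (`MSG: Inside ethernet tests` and `"network_provider": "nm"`):

```yaml
# Hypothetical reconstruction -- the real tests_ethernet.yml is not shown in this log.
- name: Show inside ethernet tests
  debug:
    msg: Inside ethernet tests

- name: Show network_provider
  debug:
    var: network_provider   # from set_fact per the trace; resolves to "nm" in this run
```

Note that `debug` is an action plugin that runs entirely on the controller, which is why the trace for these tasks goes straight from `running the handler` to `handler run complete` without any `_low_level_execute_command()` SSH activity.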
18699 1726882330.43698: running TaskExecutor() for managed_node1/TASK: Show network_provider 18699 1726882330.43748: in run() - task 12673a56-9f93-1ce6-d207-00000000000c 18699 1726882330.43759: variable 'ansible_search_path' from source: unknown 18699 1726882330.43790: calling self._execute() 18699 1726882330.43846: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.43850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.43860: variable 'omit' from source: magic vars 18699 1726882330.44126: variable 'ansible_distribution_major_version' from source: facts 18699 1726882330.44136: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882330.44141: variable 'omit' from source: magic vars 18699 1726882330.44161: variable 'omit' from source: magic vars 18699 1726882330.44186: variable 'omit' from source: magic vars 18699 1726882330.44219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882330.44246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882330.44263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882330.44276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882330.44286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882330.44312: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882330.44317: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.44319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.44384: Set connection var ansible_connection to ssh 18699 1726882330.44390: Set 
connection var ansible_pipelining to False 18699 1726882330.44400: Set connection var ansible_shell_executable to /bin/sh 18699 1726882330.44403: Set connection var ansible_timeout to 10 18699 1726882330.44405: Set connection var ansible_shell_type to sh 18699 1726882330.44411: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882330.44433: variable 'ansible_shell_executable' from source: unknown 18699 1726882330.44438: variable 'ansible_connection' from source: unknown 18699 1726882330.44441: variable 'ansible_module_compression' from source: unknown 18699 1726882330.44443: variable 'ansible_shell_type' from source: unknown 18699 1726882330.44445: variable 'ansible_shell_executable' from source: unknown 18699 1726882330.44448: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.44450: variable 'ansible_pipelining' from source: unknown 18699 1726882330.44452: variable 'ansible_timeout' from source: unknown 18699 1726882330.44454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.44553: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882330.44560: variable 'omit' from source: magic vars 18699 1726882330.44570: starting attempt loop 18699 1726882330.44573: running the handler 18699 1726882330.44609: variable 'network_provider' from source: set_fact 18699 1726882330.44660: variable 'network_provider' from source: set_fact 18699 1726882330.44677: handler run complete 18699 1726882330.44697: attempt loop complete, returning result 18699 1726882330.44702: _execute() done 18699 1726882330.44706: dumping result to json 18699 1726882330.44709: done dumping result, returning 18699 1726882330.44716: done running 
TaskExecutor() for managed_node1/TASK: Show network_provider [12673a56-9f93-1ce6-d207-00000000000c] 18699 1726882330.44718: sending task result for task 12673a56-9f93-1ce6-d207-00000000000c 18699 1726882330.44795: done sending task result for task 12673a56-9f93-1ce6-d207-00000000000c 18699 1726882330.44798: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 18699 1726882330.44840: no more pending results, returning what we have 18699 1726882330.44843: results queue empty 18699 1726882330.44844: checking for any_errors_fatal 18699 1726882330.44852: done checking for any_errors_fatal 18699 1726882330.44852: checking for max_fail_percentage 18699 1726882330.44854: done checking for max_fail_percentage 18699 1726882330.44855: checking to see if all hosts have failed and the running result is not ok 18699 1726882330.44855: done checking to see if all hosts have failed 18699 1726882330.44856: getting the remaining hosts for this loop 18699 1726882330.44857: done getting the remaining hosts for this loop 18699 1726882330.44861: getting the next task for host managed_node1 18699 1726882330.44867: done getting next task for host managed_node1 18699 1726882330.44869: ^ task is: TASK: meta (flush_handlers) 18699 1726882330.44871: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882330.44874: getting variables 18699 1726882330.44875: in VariableManager get_vars() 18699 1726882330.44901: Calling all_inventory to load vars for managed_node1 18699 1726882330.44904: Calling groups_inventory to load vars for managed_node1 18699 1726882330.44907: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.44915: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.44918: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.44920: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.45053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.45168: done with get_vars() 18699 1726882330.45174: done getting variables 18699 1726882330.45224: in VariableManager get_vars() 18699 1726882330.45231: Calling all_inventory to load vars for managed_node1 18699 1726882330.45232: Calling groups_inventory to load vars for managed_node1 18699 1726882330.45234: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.45237: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.45238: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.45240: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.45342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.45446: done with get_vars() 18699 1726882330.45454: done queuing things up, now waiting for results queue to drain 18699 1726882330.45455: results queue empty 18699 1726882330.45456: checking for any_errors_fatal 18699 1726882330.45457: done checking for any_errors_fatal 18699 1726882330.45458: checking for max_fail_percentage 18699 1726882330.45458: done checking for max_fail_percentage 18699 1726882330.45459: checking to see if all hosts have failed and the running result is not 
ok 18699 1726882330.45459: done checking to see if all hosts have failed 18699 1726882330.45460: getting the remaining hosts for this loop 18699 1726882330.45460: done getting the remaining hosts for this loop 18699 1726882330.45462: getting the next task for host managed_node1 18699 1726882330.45467: done getting next task for host managed_node1 18699 1726882330.45468: ^ task is: TASK: meta (flush_handlers) 18699 1726882330.45469: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882330.45471: getting variables 18699 1726882330.45471: in VariableManager get_vars() 18699 1726882330.45476: Calling all_inventory to load vars for managed_node1 18699 1726882330.45477: Calling groups_inventory to load vars for managed_node1 18699 1726882330.45479: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.45481: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.45483: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.45484: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.45566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.45671: done with get_vars() 18699 1726882330.45677: done getting variables 18699 1726882330.45707: in VariableManager get_vars() 18699 1726882330.45713: Calling all_inventory to load vars for managed_node1 18699 1726882330.45714: Calling groups_inventory to load vars for managed_node1 18699 1726882330.45716: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.45718: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.45719: Calling groups_plugins_inventory to load vars for 
managed_node1 18699 1726882330.45721: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.45802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.45920: done with get_vars() 18699 1726882330.45928: done queuing things up, now waiting for results queue to drain 18699 1726882330.45929: results queue empty 18699 1726882330.45929: checking for any_errors_fatal 18699 1726882330.45930: done checking for any_errors_fatal 18699 1726882330.45930: checking for max_fail_percentage 18699 1726882330.45931: done checking for max_fail_percentage 18699 1726882330.45931: checking to see if all hosts have failed and the running result is not ok 18699 1726882330.45932: done checking to see if all hosts have failed 18699 1726882330.45932: getting the remaining hosts for this loop 18699 1726882330.45933: done getting the remaining hosts for this loop 18699 1726882330.45934: getting the next task for host managed_node1 18699 1726882330.45936: done getting next task for host managed_node1 18699 1726882330.45936: ^ task is: None 18699 1726882330.45937: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882330.45938: done queuing things up, now waiting for results queue to drain 18699 1726882330.45938: results queue empty 18699 1726882330.45939: checking for any_errors_fatal 18699 1726882330.45939: done checking for any_errors_fatal 18699 1726882330.45939: checking for max_fail_percentage 18699 1726882330.45940: done checking for max_fail_percentage 18699 1726882330.45940: checking to see if all hosts have failed and the running result is not ok 18699 1726882330.45941: done checking to see if all hosts have failed 18699 1726882330.45942: getting the next task for host managed_node1 18699 1726882330.45943: done getting next task for host managed_node1 18699 1726882330.45944: ^ task is: None 18699 1726882330.45944: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882330.45972: in VariableManager get_vars() 18699 1726882330.45985: done with get_vars() 18699 1726882330.45989: in VariableManager get_vars() 18699 1726882330.45997: done with get_vars() 18699 1726882330.46000: variable 'omit' from source: magic vars 18699 1726882330.46018: in VariableManager get_vars() 18699 1726882330.46025: done with get_vars() 18699 1726882330.46036: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18699 1726882330.46156: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882330.46173: getting the remaining hosts for this loop 18699 1726882330.46174: done getting the remaining hosts for this loop 18699 1726882330.46176: getting the next task for host managed_node1 18699 1726882330.46178: done getting next task for host managed_node1 18699 1726882330.46179: ^ task is: TASK: Gathering Facts 18699 1726882330.46180: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882330.46181: getting variables 18699 1726882330.46182: in VariableManager get_vars() 18699 1726882330.46187: Calling all_inventory to load vars for managed_node1 18699 1726882330.46188: Calling groups_inventory to load vars for managed_node1 18699 1726882330.46190: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882330.46196: Calling all_plugins_play to load vars for managed_node1 18699 1726882330.46198: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882330.46201: Calling groups_plugins_play to load vars for managed_node1 18699 1726882330.46275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882330.46378: done with get_vars() 18699 1726882330.46383: done getting variables 18699 1726882330.46412: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 21:32:10 -0400 (0:00:00.031) 0:00:04.060 ****** 18699 1726882330.46431: entering _queue_task() for managed_node1/gather_facts 18699 1726882330.46599: worker is 1 (out of 1 available) 18699 1726882330.46610: exiting _queue_task() for managed_node1/gather_facts 18699 1726882330.46621: done queuing things up, now waiting for results queue to drain 18699 1726882330.46622: waiting for pending results... 
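At this point the strategy has moved on to the next play, `Test configuring ethernet devices`, whose first task is the fact-gathering step at `tests_ethernet.yml:13`. The play header is not reproduced in the log; a minimal sketch of what it plausibly looks like (the hosts pattern and options are assumptions) is:

```yaml
# Hypothetical sketch -- only the play name and the Gathering Facts task are visible in the log.
- name: Test configuring ethernet devices
  hosts: all            # assumption: the actual hosts pattern is not shown
  gather_facts: true    # produces the TASK [Gathering Facts] entry traced here
```

Unlike the `debug` tasks above, fact gathering must execute a module on the target, which is what drives the `_low_level_execute_command()` calls that follow: Ansible first runs `echo ~` over the multiplexed SSH connection to resolve the remote home directory, then creates a private temporary directory (`umask 77 && mkdir -p ...` under `/root/.ansible/tmp`) to receive the module payload.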
18699 1726882330.46754: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882330.46816: in run() - task 12673a56-9f93-1ce6-d207-0000000000f0 18699 1726882330.46829: variable 'ansible_search_path' from source: unknown 18699 1726882330.46861: calling self._execute() 18699 1726882330.46916: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.46919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.46927: variable 'omit' from source: magic vars 18699 1726882330.47177: variable 'ansible_distribution_major_version' from source: facts 18699 1726882330.47195: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882330.47202: variable 'omit' from source: magic vars 18699 1726882330.47219: variable 'omit' from source: magic vars 18699 1726882330.47244: variable 'omit' from source: magic vars 18699 1726882330.47274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882330.47311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882330.47326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882330.47339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882330.47348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882330.47370: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882330.47373: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.47376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.47454: Set connection var ansible_connection to ssh 18699 1726882330.47462: Set 
connection var ansible_pipelining to False 18699 1726882330.47467: Set connection var ansible_shell_executable to /bin/sh 18699 1726882330.47472: Set connection var ansible_timeout to 10 18699 1726882330.47474: Set connection var ansible_shell_type to sh 18699 1726882330.47479: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882330.47509: variable 'ansible_shell_executable' from source: unknown 18699 1726882330.47512: variable 'ansible_connection' from source: unknown 18699 1726882330.47514: variable 'ansible_module_compression' from source: unknown 18699 1726882330.47517: variable 'ansible_shell_type' from source: unknown 18699 1726882330.47519: variable 'ansible_shell_executable' from source: unknown 18699 1726882330.47521: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882330.47524: variable 'ansible_pipelining' from source: unknown 18699 1726882330.47526: variable 'ansible_timeout' from source: unknown 18699 1726882330.47528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882330.47655: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882330.47662: variable 'omit' from source: magic vars 18699 1726882330.47666: starting attempt loop 18699 1726882330.47668: running the handler 18699 1726882330.47681: variable 'ansible_facts' from source: unknown 18699 1726882330.47696: _low_level_execute_command(): starting 18699 1726882330.47706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882330.48188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18699 1726882330.48225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.48229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882330.48232: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.48283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882330.48286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882330.48289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882330.48340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18699 1726882330.50169: stdout chunk (state=3): >>>/root <<< 18699 1726882330.50313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882330.50316: stdout chunk (state=3): >>><<< 18699 1726882330.50318: stderr chunk (state=3): >>><<< 18699 1726882330.50423: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18699 1726882330.50428: _low_level_execute_command(): starting 18699 1726882330.50431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031 `" && echo ansible-tmp-1726882330.5034368-18924-1532620588031="` echo /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031 `" ) && sleep 0' 18699 1726882330.50982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882330.51002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882330.51017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882330.51056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.51164: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882330.51184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882330.51204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882330.51287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882330.53484: stdout chunk (state=3): >>>ansible-tmp-1726882330.5034368-18924-1532620588031=/root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031 <<< 18699 1726882330.53629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882330.53657: stderr chunk (state=3): >>><<< 18699 1726882330.53661: stdout chunk (state=3): >>><<< 18699 1726882330.53677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882330.5034368-18924-1532620588031=/root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882330.53706: variable 'ansible_module_compression' from source: unknown 18699 1726882330.53747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882330.53797: variable 'ansible_facts' from source: unknown 18699 1726882330.53925: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py 18699 1726882330.54122: Sending initial data 18699 1726882330.54125: Sent initial data (152 bytes) 18699 1726882330.54611: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882330.54615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882330.54618: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.54698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882330.54740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882330.56742: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882330.56746: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882330.56779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882330.56825: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp_xmd5nuo /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py <<< 18699 1726882330.56833: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py" <<< 18699 1726882330.56870: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp_xmd5nuo" to remote "/root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py" <<< 18699 1726882330.58161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882330.58225: stderr chunk (state=3): >>><<< 18699 1726882330.58228: stdout chunk (state=3): >>><<< 18699 1726882330.58231: done transferring module to remote 18699 1726882330.58235: _low_level_execute_command(): starting 18699 1726882330.58245: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/ /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py && sleep 0' 18699 1726882330.58852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882330.58855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882330.58858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass <<< 18699 1726882330.58860: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882330.58862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.58911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882330.58958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882330.58998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882330.60900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882330.60904: stderr chunk (state=3): >>><<< 18699 1726882330.60907: stdout chunk (state=3): >>><<< 18699 1726882330.60910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882330.60912: _low_level_execute_command(): starting 18699 1726882330.60914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/AnsiballZ_setup.py && sleep 0' 18699 1726882330.61387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882330.61402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882330.61414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882330.61429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882330.61453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882330.61460: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882330.61471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.61486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882330.61495: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882330.61506: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882330.61515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882330.61525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
18699 1726882330.61538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882330.61545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882330.61563: stderr chunk (state=3): >>>debug2: match found <<< 18699 1726882330.61573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882330.61669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882330.61676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882330.61747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.23738: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_ch<<< 18699 1726882331.23752: stdout chunk (state=3): >>>assis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, 
"size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 764, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794738176, "block_size": 4096, "block_total": 65519099, "block_available": 63914731, "block_used": 1604368, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "11", "epoch": "1726882331", "epoch_int": "1726882331", "date": "2024-09-20", "time": "21:32:11", "iso8601_micro": "2024-09-21T01:32:11.197190Z", "iso8601": "2024-09-21T01:32:11Z", "iso8601_basic": "20240920T213211197190", "iso8601_basic_short": "20240920T213211", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.4921875, "5m": 0.3232421875, "15m": 0.15673828125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": 
"10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882331.25703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882331.25730: stderr chunk (state=3): >>><<< 18699 1726882331.25733: stdout chunk (state=3): >>><<< 18699 1726882331.25763: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 764, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794738176, "block_size": 4096, "block_total": 65519099, "block_available": 63914731, "block_used": 1604368, "inode_total": 131070960, 
"inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "11", "epoch": "1726882331", "epoch_int": "1726882331", "date": "2024-09-20", "time": "21:32:11", "iso8601_micro": "2024-09-21T01:32:11.197190Z", "iso8601": "2024-09-21T01:32:11Z", "iso8601_basic": "20240920T213211197190", "iso8601_basic_short": "20240920T213211", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.4921875, "5m": 0.3232421875, "15m": 0.15673828125}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882331.25956: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882331.25976: _low_level_execute_command(): starting 18699 1726882331.25979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882330.5034368-18924-1532620588031/ > /dev/null 2>&1 && sleep 0' 18699 1726882331.26445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.26448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.26450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882331.26452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882331.26454: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.26511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882331.26516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882331.26518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.26558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.28346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882331.28373: stderr chunk (state=3): >>><<< 18699 1726882331.28376: stdout chunk (state=3): >>><<< 18699 1726882331.28388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882331.28404: handler run complete 18699 1726882331.28474: variable 'ansible_facts' from source: unknown 18699 1726882331.28548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.28737: variable 'ansible_facts' from source: unknown 18699 1726882331.28788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.28878: attempt loop complete, returning result 18699 1726882331.28882: _execute() done 18699 1726882331.28884: dumping result to json 18699 1726882331.28906: done dumping result, returning 18699 1726882331.28914: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-0000000000f0] 18699 1726882331.28917: sending task result for task 12673a56-9f93-1ce6-d207-0000000000f0 ok: [managed_node1] 18699 1726882331.29390: no more pending results, returning what we have 18699 1726882331.29392: results queue empty 18699 1726882331.29396: checking for any_errors_fatal 18699 1726882331.29397: done checking for any_errors_fatal 18699 1726882331.29398: checking for max_fail_percentage 18699 1726882331.29399: done checking for max_fail_percentage 18699 1726882331.29399: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.29400: done checking to see if all hosts have failed 18699 1726882331.29400: getting the remaining hosts for this loop 18699 1726882331.29401: done getting the remaining hosts for this loop 18699 1726882331.29404: getting the next task for host managed_node1 18699 1726882331.29407: done getting next task for host managed_node1 18699 1726882331.29408: ^ task is: TASK: meta (flush_handlers) 18699 1726882331.29409: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882331.29412: getting variables 18699 1726882331.29417: in VariableManager get_vars() 18699 1726882331.29433: Calling all_inventory to load vars for managed_node1 18699 1726882331.29434: Calling groups_inventory to load vars for managed_node1 18699 1726882331.29436: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.29445: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.29446: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.29449: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.29553: done sending task result for task 12673a56-9f93-1ce6-d207-0000000000f0 18699 1726882331.29557: WORKER PROCESS EXITING 18699 1726882331.29567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.29678: done with get_vars() 18699 1726882331.29684: done getting variables 18699 1726882331.29735: in VariableManager get_vars() 18699 1726882331.29743: Calling all_inventory to load vars for managed_node1 18699 1726882331.29745: Calling groups_inventory to load vars for managed_node1 18699 1726882331.29747: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.29750: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.29751: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.29753: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.29834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.29938: done with get_vars() 18699 1726882331.29946: done queuing things up, now waiting for results queue to drain 18699 1726882331.29948: results queue empty 18699 
1726882331.29948: checking for any_errors_fatal 18699 1726882331.29950: done checking for any_errors_fatal 18699 1726882331.29950: checking for max_fail_percentage 18699 1726882331.29955: done checking for max_fail_percentage 18699 1726882331.29956: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.29956: done checking to see if all hosts have failed 18699 1726882331.29957: getting the remaining hosts for this loop 18699 1726882331.29958: done getting the remaining hosts for this loop 18699 1726882331.29960: getting the next task for host managed_node1 18699 1726882331.29963: done getting next task for host managed_node1 18699 1726882331.29964: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18699 1726882331.29965: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.29967: getting variables 18699 1726882331.29967: in VariableManager get_vars() 18699 1726882331.29972: Calling all_inventory to load vars for managed_node1 18699 1726882331.29973: Calling groups_inventory to load vars for managed_node1 18699 1726882331.29975: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.29978: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.29979: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.29981: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.30058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.30179: done with get_vars() 18699 1726882331.30185: done getting variables 18699 1726882331.30216: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882331.30317: variable 'type' from source: play vars 18699 1726882331.30321: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 21:32:11 -0400 (0:00:00.839) 0:00:04.899 ****** 18699 1726882331.30347: entering _queue_task() for managed_node1/set_fact 18699 1726882331.30555: worker is 1 (out of 1 available) 18699 1726882331.30568: exiting _queue_task() for managed_node1/set_fact 18699 1726882331.30579: done queuing things up, now waiting for results queue to drain 18699 1726882331.30580: waiting for pending results... 
18699 1726882331.30725: running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 18699 1726882331.30780: in run() - task 12673a56-9f93-1ce6-d207-00000000000f 18699 1726882331.30791: variable 'ansible_search_path' from source: unknown 18699 1726882331.30823: calling self._execute() 18699 1726882331.30879: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.30883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.30891: variable 'omit' from source: magic vars 18699 1726882331.31150: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.31160: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.31171: variable 'omit' from source: magic vars 18699 1726882331.31189: variable 'omit' from source: magic vars 18699 1726882331.31213: variable 'type' from source: play vars 18699 1726882331.31267: variable 'type' from source: play vars 18699 1726882331.31278: variable 'interface' from source: play vars 18699 1726882331.31325: variable 'interface' from source: play vars 18699 1726882331.31337: variable 'omit' from source: magic vars 18699 1726882331.31368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882331.31400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882331.31415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882331.31428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.31438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.31462: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 
1726882331.31465: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.31468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.31537: Set connection var ansible_connection to ssh 18699 1726882331.31543: Set connection var ansible_pipelining to False 18699 1726882331.31549: Set connection var ansible_shell_executable to /bin/sh 18699 1726882331.31554: Set connection var ansible_timeout to 10 18699 1726882331.31556: Set connection var ansible_shell_type to sh 18699 1726882331.31561: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882331.31583: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.31586: variable 'ansible_connection' from source: unknown 18699 1726882331.31588: variable 'ansible_module_compression' from source: unknown 18699 1726882331.31591: variable 'ansible_shell_type' from source: unknown 18699 1726882331.31597: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.31600: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.31602: variable 'ansible_pipelining' from source: unknown 18699 1726882331.31607: variable 'ansible_timeout' from source: unknown 18699 1726882331.31609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.31706: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882331.31714: variable 'omit' from source: magic vars 18699 1726882331.31726: starting attempt loop 18699 1726882331.31728: running the handler 18699 1726882331.31734: handler run complete 18699 1726882331.31742: attempt loop complete, returning result 18699 1726882331.31744: _execute() done 18699 
1726882331.31746: dumping result to json 18699 1726882331.31749: done dumping result, returning 18699 1726882331.31755: done running TaskExecutor() for managed_node1/TASK: Set type=veth and interface=lsr27 [12673a56-9f93-1ce6-d207-00000000000f] 18699 1726882331.31758: sending task result for task 12673a56-9f93-1ce6-d207-00000000000f 18699 1726882331.31836: done sending task result for task 12673a56-9f93-1ce6-d207-00000000000f 18699 1726882331.31839: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 18699 1726882331.31887: no more pending results, returning what we have 18699 1726882331.31890: results queue empty 18699 1726882331.31891: checking for any_errors_fatal 18699 1726882331.31897: done checking for any_errors_fatal 18699 1726882331.31898: checking for max_fail_percentage 18699 1726882331.31899: done checking for max_fail_percentage 18699 1726882331.31900: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.31901: done checking to see if all hosts have failed 18699 1726882331.31902: getting the remaining hosts for this loop 18699 1726882331.31903: done getting the remaining hosts for this loop 18699 1726882331.31906: getting the next task for host managed_node1 18699 1726882331.31912: done getting next task for host managed_node1 18699 1726882331.31914: ^ task is: TASK: Include the task 'show_interfaces.yml' 18699 1726882331.31916: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.31919: getting variables 18699 1726882331.31920: in VariableManager get_vars() 18699 1726882331.31944: Calling all_inventory to load vars for managed_node1 18699 1726882331.31947: Calling groups_inventory to load vars for managed_node1 18699 1726882331.31949: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.31958: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.31960: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.31963: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.32098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.32210: done with get_vars() 18699 1726882331.32220: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 21:32:11 -0400 (0:00:00.019) 0:00:04.918 ****** 18699 1726882331.32277: entering _queue_task() for managed_node1/include_tasks 18699 1726882331.32475: worker is 1 (out of 1 available) 18699 1726882331.32488: exiting _queue_task() for managed_node1/include_tasks 18699 1726882331.32503: done queuing things up, now waiting for results queue to drain 18699 1726882331.32504: waiting for pending results... 
18699 1726882331.32639: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18699 1726882331.32689: in run() - task 12673a56-9f93-1ce6-d207-000000000010 18699 1726882331.32703: variable 'ansible_search_path' from source: unknown 18699 1726882331.32732: calling self._execute() 18699 1726882331.32786: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.32789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.32802: variable 'omit' from source: magic vars 18699 1726882331.33061: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.33071: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.33076: _execute() done 18699 1726882331.33080: dumping result to json 18699 1726882331.33084: done dumping result, returning 18699 1726882331.33089: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-1ce6-d207-000000000010] 18699 1726882331.33098: sending task result for task 12673a56-9f93-1ce6-d207-000000000010 18699 1726882331.33182: done sending task result for task 12673a56-9f93-1ce6-d207-000000000010 18699 1726882331.33185: WORKER PROCESS EXITING 18699 1726882331.33230: no more pending results, returning what we have 18699 1726882331.33235: in VariableManager get_vars() 18699 1726882331.33261: Calling all_inventory to load vars for managed_node1 18699 1726882331.33263: Calling groups_inventory to load vars for managed_node1 18699 1726882331.33266: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.33275: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.33277: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.33279: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.33440: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.33547: done with get_vars() 18699 1726882331.33552: variable 'ansible_search_path' from source: unknown 18699 1726882331.33561: we have included files to process 18699 1726882331.33561: generating all_blocks data 18699 1726882331.33562: done generating all_blocks data 18699 1726882331.33563: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18699 1726882331.33563: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18699 1726882331.33565: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18699 1726882331.33667: in VariableManager get_vars() 18699 1726882331.33677: done with get_vars() 18699 1726882331.33761: done processing included file 18699 1726882331.33763: iterating over new_blocks loaded from include file 18699 1726882331.33764: in VariableManager get_vars() 18699 1726882331.33772: done with get_vars() 18699 1726882331.33773: filtering new block on tags 18699 1726882331.33782: done filtering new block on tags 18699 1726882331.33784: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 18699 1726882331.33787: extending task lists for all hosts with included blocks 18699 1726882331.33834: done extending task lists 18699 1726882331.33835: done processing included files 18699 1726882331.33836: results queue empty 18699 1726882331.33836: checking for any_errors_fatal 18699 1726882331.33838: done checking for any_errors_fatal 18699 1726882331.33838: checking for max_fail_percentage 18699 1726882331.33839: done checking for 
max_fail_percentage 18699 1726882331.33839: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.33840: done checking to see if all hosts have failed 18699 1726882331.33840: getting the remaining hosts for this loop 18699 1726882331.33841: done getting the remaining hosts for this loop 18699 1726882331.33843: getting the next task for host managed_node1 18699 1726882331.33845: done getting next task for host managed_node1 18699 1726882331.33846: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18699 1726882331.33848: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.33849: getting variables 18699 1726882331.33850: in VariableManager get_vars() 18699 1726882331.33855: Calling all_inventory to load vars for managed_node1 18699 1726882331.33856: Calling groups_inventory to load vars for managed_node1 18699 1726882331.33858: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.33861: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.33862: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.33864: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.33947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.34220: done with get_vars() 18699 1726882331.34226: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 21:32:11 -0400 (0:00:00.019) 0:00:04.938 ******

18699 1726882331.34279: entering _queue_task() for managed_node1/include_tasks 18699 1726882331.34476: worker is 1 (out of 1 available) 18699 1726882331.34488: exiting _queue_task() for managed_node1/include_tasks 18699 1726882331.34503: done queuing things up, now waiting for results queue to drain 18699 1726882331.34505: waiting for pending results... 
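Each TASK banner above carries a timestamp plus two durations: the previous task's elapsed time in parentheses and the cumulative playbook time (here 0:00:00.019 and 0:00:04.938). This format typically comes from a timing callback such as profile_tasks; assuming that, a minimal sketch of pulling those durations out of a banner line. The helper name and regex are illustrative, not part of Ansible:

```python
import re

# Hypothetical helper (not part of Ansible): extract the two durations from a
# timing banner line like
#   "Friday 20 September 2024 21:32:11 -0400 (0:00:00.019) 0:00:04.938 ******"
# Durations carry a fractional seconds field, so the wall-clock time
# "21:32:11" (no decimal point) does not match this pattern.
_DURATION = re.compile(r"(\d+):(\d{2}):(\d{2}\.\d+)")

def parse_banner_durations(line: str) -> tuple[float, float]:
    """Return (previous_task_seconds, cumulative_seconds)."""
    prev, total = _DURATION.findall(line)[:2]
    to_seconds = lambda h, m, s: int(h) * 3600 + int(m) * 60 + float(s)
    return to_seconds(*prev), to_seconds(*total)
```

Applied to the banner above, this yields roughly 0.019 s for the previous task and 4.938 s total.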
18699 1726882331.34657: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 18699 1726882331.34727: in run() - task 12673a56-9f93-1ce6-d207-000000000104 18699 1726882331.34736: variable 'ansible_search_path' from source: unknown 18699 1726882331.34741: variable 'ansible_search_path' from source: unknown 18699 1726882331.34777: calling self._execute() 18699 1726882331.34837: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.34841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.34852: variable 'omit' from source: magic vars 18699 1726882331.35302: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.35306: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.35309: _execute() done 18699 1726882331.35311: dumping result to json 18699 1726882331.35313: done dumping result, returning 18699 1726882331.35316: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-1ce6-d207-000000000104] 18699 1726882331.35318: sending task result for task 12673a56-9f93-1ce6-d207-000000000104 18699 1726882331.35387: done sending task result for task 12673a56-9f93-1ce6-d207-000000000104 18699 1726882331.35390: WORKER PROCESS EXITING 18699 1726882331.35421: no more pending results, returning what we have 18699 1726882331.35426: in VariableManager get_vars() 18699 1726882331.35462: Calling all_inventory to load vars for managed_node1 18699 1726882331.35466: Calling groups_inventory to load vars for managed_node1 18699 1726882331.35470: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.35485: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.35489: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.35492: Calling groups_plugins_play to load vars for managed_node1 18699 
1726882331.35812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.36084: done with get_vars() 18699 1726882331.36092: variable 'ansible_search_path' from source: unknown 18699 1726882331.36096: variable 'ansible_search_path' from source: unknown 18699 1726882331.36133: we have included files to process 18699 1726882331.36135: generating all_blocks data 18699 1726882331.36136: done generating all_blocks data 18699 1726882331.36137: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18699 1726882331.36138: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18699 1726882331.36140: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18699 1726882331.36358: done processing included file 18699 1726882331.36360: iterating over new_blocks loaded from include file 18699 1726882331.36361: in VariableManager get_vars() 18699 1726882331.36371: done with get_vars() 18699 1726882331.36372: filtering new block on tags 18699 1726882331.36383: done filtering new block on tags 18699 1726882331.36384: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 18699 1726882331.36387: extending task lists for all hosts with included blocks 18699 1726882331.36449: done extending task lists 18699 1726882331.36450: done processing included files 18699 1726882331.36450: results queue empty 18699 1726882331.36451: checking for any_errors_fatal 18699 1726882331.36453: done checking for any_errors_fatal 18699 1726882331.36453: checking for max_fail_percentage 18699 1726882331.36454: done 
checking for max_fail_percentage 18699 1726882331.36454: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.36455: done checking to see if all hosts have failed 18699 1726882331.36455: getting the remaining hosts for this loop 18699 1726882331.36456: done getting the remaining hosts for this loop 18699 1726882331.36457: getting the next task for host managed_node1 18699 1726882331.36460: done getting next task for host managed_node1 18699 1726882331.36461: ^ task is: TASK: Gather current interface info 18699 1726882331.36463: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.36464: getting variables 18699 1726882331.36465: in VariableManager get_vars() 18699 1726882331.36470: Calling all_inventory to load vars for managed_node1 18699 1726882331.36472: Calling groups_inventory to load vars for managed_node1 18699 1726882331.36474: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.36479: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.36480: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.36482: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.36579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.36684: done with get_vars() 18699 1726882331.36690: done getting variables 18699 1726882331.36719: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 21:32:11 -0400 (0:00:00.024) 0:00:04.963 ******

18699 1726882331.36739: entering _queue_task() for managed_node1/command 18699 1726882331.36919: worker is 1 (out of 1 available) 18699 1726882331.36932: exiting _queue_task() for managed_node1/command 18699 1726882331.36942: done queuing things up, now waiting for results queue to drain 18699 1726882331.36943: waiting for pending results... 
18699 1726882331.37116: running TaskExecutor() for managed_node1/TASK: Gather current interface info 18699 1726882331.37157: in run() - task 12673a56-9f93-1ce6-d207-000000000115 18699 1726882331.37175: variable 'ansible_search_path' from source: unknown 18699 1726882331.37179: variable 'ansible_search_path' from source: unknown 18699 1726882331.37206: calling self._execute() 18699 1726882331.37264: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.37267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.37277: variable 'omit' from source: magic vars 18699 1726882331.37530: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.37539: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.37544: variable 'omit' from source: magic vars 18699 1726882331.37573: variable 'omit' from source: magic vars 18699 1726882331.37698: variable 'omit' from source: magic vars 18699 1726882331.37702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882331.37705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882331.37707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882331.37729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.37746: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.37780: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882331.37789: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.37800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 
1726882331.37898: Set connection var ansible_connection to ssh 18699 1726882331.37913: Set connection var ansible_pipelining to False 18699 1726882331.37924: Set connection var ansible_shell_executable to /bin/sh 18699 1726882331.38064: Set connection var ansible_timeout to 10 18699 1726882331.38074: Set connection var ansible_shell_type to sh 18699 1726882331.38085: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882331.38122: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.38132: variable 'ansible_connection' from source: unknown 18699 1726882331.38141: variable 'ansible_module_compression' from source: unknown 18699 1726882331.38149: variable 'ansible_shell_type' from source: unknown 18699 1726882331.38158: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.38166: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.38300: variable 'ansible_pipelining' from source: unknown 18699 1726882331.38303: variable 'ansible_timeout' from source: unknown 18699 1726882331.38305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.38336: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882331.38354: variable 'omit' from source: magic vars 18699 1726882331.38365: starting attempt loop 18699 1726882331.38373: running the handler 18699 1726882331.38396: _low_level_execute_command(): starting 18699 1726882331.38412: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882331.39101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882331.39117: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 18699 1726882331.39131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.39153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882331.39216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.39269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882331.39287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882331.39314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.39484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.41084: stdout chunk (state=3): >>>/root <<< 18699 1726882331.41212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882331.41234: stdout chunk (state=3): >>><<< 18699 1726882331.41248: stderr chunk (state=3): >>><<< 18699 1726882331.41272: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882331.41283: _low_level_execute_command(): starting 18699 1726882331.41289: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985 `" && echo ansible-tmp-1726882331.4127202-18960-14037656566985="` echo /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985 `" ) && sleep 0' 18699 1726882331.41731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.41735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882331.41737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 
1726882331.41747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882331.41749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882331.41752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.41789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882331.41795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.41845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.43711: stdout chunk (state=3): >>>ansible-tmp-1726882331.4127202-18960-14037656566985=/root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985 <<< 18699 1726882331.43814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882331.43836: stderr chunk (state=3): >>><<< 18699 1726882331.43839: stdout chunk (state=3): >>><<< 18699 1726882331.43854: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882331.4127202-18960-14037656566985=/root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882331.43878: variable 'ansible_module_compression' from source: unknown 18699 1726882331.43924: ANSIBALLZ: Using generic lock for ansible.legacy.command 18699 1726882331.43927: ANSIBALLZ: Acquiring lock 18699 1726882331.43930: ANSIBALLZ: Lock acquired: 140254445799856 18699 1726882331.43932: ANSIBALLZ: Creating module 18699 1726882331.53113: ANSIBALLZ: Writing module into payload 18699 1726882331.53175: ANSIBALLZ: Writing module 18699 1726882331.53192: ANSIBALLZ: Renaming module 18699 1726882331.53199: ANSIBALLZ: Done creating module 18699 1726882331.53212: variable 'ansible_facts' from source: unknown 18699 1726882331.53256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py 18699 1726882331.53353: Sending initial data 18699 1726882331.53356: Sent initial data (155 bytes) 18699 1726882331.53790: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.53798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882331.53802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.53806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.53809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.53905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.53972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.55555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
18699 1726882331.55601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882331.55638: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpun_2khfr /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py <<< 18699 1726882331.55662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py" <<< 18699 1726882331.55707: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpun_2khfr" to remote "/root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py" <<< 18699 1726882331.56613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882331.56616: stdout chunk (state=3): >>><<< 18699 1726882331.56618: stderr chunk (state=3): >>><<< 18699 1726882331.56692: done transferring module to remote 18699 1726882331.56718: _low_level_execute_command(): starting 18699 1726882331.56797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/ /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py && sleep 0' 18699 1726882331.57706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882331.57723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.57823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882331.57867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882331.57949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.57989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.59787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882331.59790: stdout chunk (state=3): >>><<< 18699 1726882331.59797: stderr chunk (state=3): >>><<< 18699 1726882331.59813: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882331.59820: _low_level_execute_command(): starting 18699 1726882331.59891: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/AnsiballZ_command.py && sleep 0' 18699 1726882331.60429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882331.60449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882331.60461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.60478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882331.60509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882331.60561: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 
1726882331.60615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882331.60631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882331.60671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.60727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.76026: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:32:11.755915", "end": "2024-09-20 21:32:11.758941", "delta": "0:00:00.003026", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882331.77447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882331.77465: stderr chunk (state=3): >>><<< 18699 1726882331.77476: stdout chunk (state=3): >>><<< 18699 1726882331.77509: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:32:11.755915", "end": "2024-09-20 21:32:11.758941", "delta": "0:00:00.003026", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
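The module's raw return, echoed twice above, is a single JSON object on stdout: `rc`, `stdout`, `cmd`, timing fields, and the full `invocation`. As an illustration of how a consumer of this result recovers the interface list (a sketch against a trimmed copy of the result, not Ansible's own code):

```python
import json

# Trimmed copy of the command-module return shown in the log above;
# only the fields used below are kept.
raw_result = (
    '{"changed": true, "stdout": "bonding_masters\\neth0\\nlo", '
    '"rc": 0, "cmd": ["ls", "-1"]}'
)

result = json.loads(raw_result)
interfaces = []
if result["rc"] == 0:
    # One interface name per line of `ls -1` output in /sys/class/net.
    interfaces = result["stdout"].splitlines()
# interfaces is now ['bonding_masters', 'eth0', 'lo']
```

This mirrors what the upcoming "Set current_interfaces" task presumably does with `stdout_lines` on the Ansible side.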
18699 1726882331.77565: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882331.77569: _low_level_execute_command(): starting 18699 1726882331.77651: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882331.4127202-18960-14037656566985/ > /dev/null 2>&1 && sleep 0' 18699 1726882331.78252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882331.78265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882331.78281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882331.78318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882331.78427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882331.78460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882331.78537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882331.80343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882331.80362: stdout chunk (state=3): >>><<< 18699 1726882331.80396: stderr chunk (state=3): >>><<< 18699 1726882331.80601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 18699 1726882331.80605: handler run complete 18699 1726882331.80607: Evaluated conditional (False): False 18699 1726882331.80609: attempt loop complete, returning result 18699 1726882331.80611: _execute() done 18699 1726882331.80613: dumping result to json 18699 1726882331.80615: done dumping result, returning 18699 1726882331.80617: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-1ce6-d207-000000000115] 18699 1726882331.80619: sending task result for task 12673a56-9f93-1ce6-d207-000000000115 18699 1726882331.80699: done sending task result for task 12673a56-9f93-1ce6-d207-000000000115 18699 1726882331.80703: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003026", "end": "2024-09-20 21:32:11.758941", "rc": 0, "start": "2024-09-20 21:32:11.755915" } STDOUT: bonding_masters eth0 lo 18699 1726882331.80781: no more pending results, returning what we have 18699 1726882331.80784: results queue empty 18699 1726882331.80785: checking for any_errors_fatal 18699 1726882331.80787: done checking for any_errors_fatal 18699 1726882331.80788: checking for max_fail_percentage 18699 1726882331.80790: done checking for max_fail_percentage 18699 1726882331.80791: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.80791: done checking to see if all hosts have failed 18699 1726882331.80792: getting the remaining hosts for this loop 18699 1726882331.80840: done getting the remaining hosts for this loop 18699 1726882331.80845: getting the next task for host managed_node1 18699 1726882331.80852: done getting next task for host managed_node1 18699 1726882331.80855: ^ task is: TASK: Set current_interfaces 18699 1726882331.80859: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882331.80863: getting variables 18699 1726882331.80864: in VariableManager get_vars() 18699 1726882331.80892: Calling all_inventory to load vars for managed_node1 18699 1726882331.80899: Calling groups_inventory to load vars for managed_node1 18699 1726882331.80902: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.81198: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.81202: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.81205: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.81370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.81598: done with get_vars() 18699 1726882331.81608: done getting variables 18699 1726882331.81663: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:32:11 -0400 (0:00:00.449) 0:00:05.413 ****** 18699 1726882331.81695: entering _queue_task() for managed_node1/set_fact 18699 1726882331.81936: worker is 1 (out of 1 available) 18699 1726882331.81947: exiting _queue_task() for managed_node1/set_fact 18699 1726882331.81957: done queuing things up, now waiting for results queue to drain 18699 1726882331.81958: waiting for pending results... 18699 1726882331.82407: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 18699 1726882331.82412: in run() - task 12673a56-9f93-1ce6-d207-000000000116 18699 1726882331.82414: variable 'ansible_search_path' from source: unknown 18699 1726882331.82417: variable 'ansible_search_path' from source: unknown 18699 1726882331.82419: calling self._execute() 18699 1726882331.82422: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.82434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.82450: variable 'omit' from source: magic vars 18699 1726882331.82794: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.82811: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.82820: variable 'omit' from source: magic vars 18699 1726882331.82871: variable 'omit' from source: magic vars 18699 1726882331.82981: variable '_current_interfaces' from source: set_fact 18699 1726882331.83047: variable 'omit' from source: magic vars 18699 1726882331.83091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882331.83131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882331.83155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 
1726882331.83175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.83195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.83229: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882331.83237: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.83244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.83342: Set connection var ansible_connection to ssh 18699 1726882331.83355: Set connection var ansible_pipelining to False 18699 1726882331.83364: Set connection var ansible_shell_executable to /bin/sh 18699 1726882331.83373: Set connection var ansible_timeout to 10 18699 1726882331.83379: Set connection var ansible_shell_type to sh 18699 1726882331.83406: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882331.83426: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.83434: variable 'ansible_connection' from source: unknown 18699 1726882331.83440: variable 'ansible_module_compression' from source: unknown 18699 1726882331.83515: variable 'ansible_shell_type' from source: unknown 18699 1726882331.83518: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.83520: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.83522: variable 'ansible_pipelining' from source: unknown 18699 1726882331.83524: variable 'ansible_timeout' from source: unknown 18699 1726882331.83526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.83610: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882331.83630: variable 'omit' from source: magic vars 18699 1726882331.83638: starting attempt loop 18699 1726882331.83644: running the handler 18699 1726882331.83657: handler run complete 18699 1726882331.83670: attempt loop complete, returning result 18699 1726882331.83676: _execute() done 18699 1726882331.83682: dumping result to json 18699 1726882331.83689: done dumping result, returning 18699 1726882331.83701: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-1ce6-d207-000000000116] 18699 1726882331.83709: sending task result for task 12673a56-9f93-1ce6-d207-000000000116 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18699 1726882331.83892: no more pending results, returning what we have 18699 1726882331.83896: results queue empty 18699 1726882331.83897: checking for any_errors_fatal 18699 1726882331.83910: done checking for any_errors_fatal 18699 1726882331.83911: checking for max_fail_percentage 18699 1726882331.83913: done checking for max_fail_percentage 18699 1726882331.83913: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.83914: done checking to see if all hosts have failed 18699 1726882331.83915: getting the remaining hosts for this loop 18699 1726882331.83916: done getting the remaining hosts for this loop 18699 1726882331.83920: getting the next task for host managed_node1 18699 1726882331.83929: done getting next task for host managed_node1 18699 1726882331.83931: ^ task is: TASK: Show current_interfaces 18699 1726882331.83934: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882331.83937: getting variables 18699 1726882331.83939: in VariableManager get_vars() 18699 1726882331.83965: Calling all_inventory to load vars for managed_node1 18699 1726882331.83968: Calling groups_inventory to load vars for managed_node1 18699 1726882331.83972: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.83982: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.83985: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.83987: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.84267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.84578: done with get_vars() 18699 1726882331.84588: done getting variables 18699 1726882331.84619: done sending task result for task 12673a56-9f93-1ce6-d207-000000000116 18699 1726882331.84622: WORKER PROCESS EXITING 18699 1726882331.84650: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:32:11 -0400 (0:00:00.029) 0:00:05.442 ****** 18699 1726882331.84675: entering 
_queue_task() for managed_node1/debug 18699 1726882331.84922: worker is 1 (out of 1 available) 18699 1726882331.84935: exiting _queue_task() for managed_node1/debug 18699 1726882331.84947: done queuing things up, now waiting for results queue to drain 18699 1726882331.84948: waiting for pending results... 18699 1726882331.85205: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 18699 1726882331.85452: in run() - task 12673a56-9f93-1ce6-d207-000000000105 18699 1726882331.85475: variable 'ansible_search_path' from source: unknown 18699 1726882331.85482: variable 'ansible_search_path' from source: unknown 18699 1726882331.85528: calling self._execute() 18699 1726882331.85629: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.85641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.85661: variable 'omit' from source: magic vars 18699 1726882331.86062: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.86081: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.86100: variable 'omit' from source: magic vars 18699 1726882331.86145: variable 'omit' from source: magic vars 18699 1726882331.86251: variable 'current_interfaces' from source: set_fact 18699 1726882331.86284: variable 'omit' from source: magic vars 18699 1726882331.86340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882331.86380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882331.86412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882331.86444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.86460: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882331.86496: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882331.86507: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.86516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.86625: Set connection var ansible_connection to ssh 18699 1726882331.86652: Set connection var ansible_pipelining to False 18699 1726882331.86747: Set connection var ansible_shell_executable to /bin/sh 18699 1726882331.86750: Set connection var ansible_timeout to 10 18699 1726882331.86752: Set connection var ansible_shell_type to sh 18699 1726882331.86754: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882331.86758: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.86760: variable 'ansible_connection' from source: unknown 18699 1726882331.86762: variable 'ansible_module_compression' from source: unknown 18699 1726882331.86763: variable 'ansible_shell_type' from source: unknown 18699 1726882331.86765: variable 'ansible_shell_executable' from source: unknown 18699 1726882331.86767: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.86768: variable 'ansible_pipelining' from source: unknown 18699 1726882331.86770: variable 'ansible_timeout' from source: unknown 18699 1726882331.86774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.87183: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882331.87187: variable 'omit' from source: magic vars 18699 1726882331.87189: starting attempt 
loop 18699 1726882331.87192: running the handler 18699 1726882331.87268: handler run complete 18699 1726882331.87312: attempt loop complete, returning result 18699 1726882331.87340: _execute() done 18699 1726882331.87348: dumping result to json 18699 1726882331.87356: done dumping result, returning 18699 1726882331.87445: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-1ce6-d207-000000000105] 18699 1726882331.87448: sending task result for task 12673a56-9f93-1ce6-d207-000000000105 18699 1726882331.87741: done sending task result for task 12673a56-9f93-1ce6-d207-000000000105 18699 1726882331.87745: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18699 1726882331.87791: no more pending results, returning what we have 18699 1726882331.87799: results queue empty 18699 1726882331.87801: checking for any_errors_fatal 18699 1726882331.87806: done checking for any_errors_fatal 18699 1726882331.87807: checking for max_fail_percentage 18699 1726882331.87810: done checking for max_fail_percentage 18699 1726882331.87810: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.87811: done checking to see if all hosts have failed 18699 1726882331.87812: getting the remaining hosts for this loop 18699 1726882331.87813: done getting the remaining hosts for this loop 18699 1726882331.87817: getting the next task for host managed_node1 18699 1726882331.87825: done getting next task for host managed_node1 18699 1726882331.87828: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18699 1726882331.87830: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.87839: getting variables 18699 1726882331.87841: in VariableManager get_vars() 18699 1726882331.87870: Calling all_inventory to load vars for managed_node1 18699 1726882331.87873: Calling groups_inventory to load vars for managed_node1 18699 1726882331.87876: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.87887: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.87890: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.87897: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.88623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.89025: done with get_vars() 18699 1726882331.89038: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 21:32:11 -0400 (0:00:00.044) 0:00:05.487 ****** 18699 1726882331.89167: entering _queue_task() for managed_node1/include_tasks 18699 1726882331.89764: worker is 1 (out of 1 available) 18699 1726882331.89777: exiting _queue_task() for managed_node1/include_tasks 18699 1726882331.89788: done queuing things up, now waiting for results queue to drain 18699 1726882331.89789: waiting for pending results... 
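The "Set current_interfaces" task recorded earlier stored the command's stdout as a list fact (`current_interfaces: ['bonding_masters', 'eth0', 'lo']`). The transformation from raw stdout to that list is equivalent to this sketch (function and variable names are illustrative, not Ansible internals):

```python
def stdout_to_fact(stdout):
    """Split raw command stdout into a list of interface names,
    dropping empty lines (comparable to Ansible's stdout_lines)."""
    return [line for line in stdout.splitlines() if line]

raw_stdout = "bonding_masters\neth0\nlo"  # stdout from the task result in the log
current_interfaces = stdout_to_fact(raw_stdout)
print(current_interfaces)
```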
18699 1726882331.90116: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 18699 1726882331.90265: in run() - task 12673a56-9f93-1ce6-d207-000000000011 18699 1726882331.90282: variable 'ansible_search_path' from source: unknown 18699 1726882331.90322: calling self._execute() 18699 1726882331.90600: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.90619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.90641: variable 'omit' from source: magic vars 18699 1726882331.91088: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.91111: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.91208: _execute() done 18699 1726882331.91212: dumping result to json 18699 1726882331.91215: done dumping result, returning 18699 1726882331.91218: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-1ce6-d207-000000000011] 18699 1726882331.91222: sending task result for task 12673a56-9f93-1ce6-d207-000000000011 18699 1726882331.91288: done sending task result for task 12673a56-9f93-1ce6-d207-000000000011 18699 1726882331.91292: WORKER PROCESS EXITING 18699 1726882331.91342: no more pending results, returning what we have 18699 1726882331.91348: in VariableManager get_vars() 18699 1726882331.91381: Calling all_inventory to load vars for managed_node1 18699 1726882331.91384: Calling groups_inventory to load vars for managed_node1 18699 1726882331.91387: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.91401: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.91404: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.91407: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.91835: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.92039: done with get_vars() 18699 1726882331.92046: variable 'ansible_search_path' from source: unknown 18699 1726882331.92058: we have included files to process 18699 1726882331.92059: generating all_blocks data 18699 1726882331.92060: done generating all_blocks data 18699 1726882331.92063: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18699 1726882331.92065: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18699 1726882331.92067: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18699 1726882331.93104: in VariableManager get_vars() 18699 1726882331.93119: done with get_vars() 18699 1726882331.93523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 18699 1726882331.94665: done processing included file 18699 1726882331.94667: iterating over new_blocks loaded from include file 18699 1726882331.94668: in VariableManager get_vars() 18699 1726882331.94679: done with get_vars() 18699 1726882331.94680: filtering new block on tags 18699 1726882331.94711: done filtering new block on tags 18699 1726882331.94714: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 18699 1726882331.94719: extending task lists for all hosts with included blocks 18699 1726882331.95083: done extending task lists 18699 1726882331.95084: done processing included files 18699 1726882331.95085: results queue empty 18699 1726882331.95086: checking for any_errors_fatal 18699 1726882331.95088: done checking for 
any_errors_fatal 18699 1726882331.95089: checking for max_fail_percentage 18699 1726882331.95090: done checking for max_fail_percentage 18699 1726882331.95091: checking to see if all hosts have failed and the running result is not ok 18699 1726882331.95091: done checking to see if all hosts have failed 18699 1726882331.95092: getting the remaining hosts for this loop 18699 1726882331.95095: done getting the remaining hosts for this loop 18699 1726882331.95097: getting the next task for host managed_node1 18699 1726882331.95100: done getting next task for host managed_node1 18699 1726882331.95102: ^ task is: TASK: Ensure state in ["present", "absent"] 18699 1726882331.95104: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.95106: getting variables 18699 1726882331.95107: in VariableManager get_vars() 18699 1726882331.95115: Calling all_inventory to load vars for managed_node1 18699 1726882331.95117: Calling groups_inventory to load vars for managed_node1 18699 1726882331.95120: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.95125: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.95127: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.95130: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.95469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.95648: done with get_vars() 18699 1726882331.95657: done getting variables 18699 1726882331.95719: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:32:11 -0400 (0:00:00.065) 0:00:05.553 ****** 18699 1726882331.95744: entering _queue_task() for managed_node1/fail 18699 1726882331.95746: Creating lock for fail 18699 1726882331.95999: worker is 1 (out of 1 available) 18699 1726882331.96011: exiting _queue_task() for managed_node1/fail 18699 1726882331.96020: done queuing things up, now waiting for results queue to drain 18699 1726882331.96022: waiting for pending results... 
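The two guard tasks in manage_test_interface.yml ("Ensure state in [\"present\", \"absent\"]" and "Ensure type in [\"dummy\", \"tap\", \"veth\"]") use the fail module behind a `when:` conditional, so the task only fires when the value is invalid; with valid values the conditional evaluates False and the task is skipped, which is what the `skipping:` results in this log show. A plain-Python sketch of that guard logic (an illustration, not Ansible's implementation):

```python
def guard(value, allowed, what):
    """Mimic `fail: ... when: value not in allowed` - raise for an
    invalid value, otherwise report that the fail task was skipped."""
    if value not in allowed:
        raise ValueError(f"{what} must be one of {allowed}, got {value!r}")
    return "skipping"  # conditional was False, so the fail task does not run

print(guard("present", ["present", "absent"], "state"))
print(guard("veth", ["dummy", "tap", "veth"], "type"))
```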
18699 1726882331.96415: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 18699 1726882331.96420: in run() - task 12673a56-9f93-1ce6-d207-000000000131 18699 1726882331.96424: variable 'ansible_search_path' from source: unknown 18699 1726882331.96426: variable 'ansible_search_path' from source: unknown 18699 1726882331.96428: calling self._execute() 18699 1726882331.96477: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.96486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.96502: variable 'omit' from source: magic vars 18699 1726882331.96853: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.96868: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.97008: variable 'state' from source: include params 18699 1726882331.97019: Evaluated conditional (state not in ["present", "absent"]): False 18699 1726882331.97025: when evaluation is False, skipping this task 18699 1726882331.97032: _execute() done 18699 1726882331.97038: dumping result to json 18699 1726882331.97045: done dumping result, returning 18699 1726882331.97058: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-1ce6-d207-000000000131] 18699 1726882331.97067: sending task result for task 12673a56-9f93-1ce6-d207-000000000131 skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 18699 1726882331.97205: no more pending results, returning what we have 18699 1726882331.97209: results queue empty 18699 1726882331.97209: checking for any_errors_fatal 18699 1726882331.97211: done checking for any_errors_fatal 18699 1726882331.97212: checking for max_fail_percentage 18699 1726882331.97213: done checking for max_fail_percentage 18699 1726882331.97214: checking to see if all hosts 
have failed and the running result is not ok 18699 1726882331.97215: done checking to see if all hosts have failed 18699 1726882331.97216: getting the remaining hosts for this loop 18699 1726882331.97217: done getting the remaining hosts for this loop 18699 1726882331.97220: getting the next task for host managed_node1 18699 1726882331.97227: done getting next task for host managed_node1 18699 1726882331.97229: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 18699 1726882331.97232: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.97235: getting variables 18699 1726882331.97237: in VariableManager get_vars() 18699 1726882331.97265: Calling all_inventory to load vars for managed_node1 18699 1726882331.97267: Calling groups_inventory to load vars for managed_node1 18699 1726882331.97271: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.97283: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.97286: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.97290: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.97603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882331.97915: done with get_vars() 18699 1726882331.97923: done getting variables 18699 1726882331.97949: done sending task result for task 12673a56-9f93-1ce6-d207-000000000131 18699 1726882331.97952: WORKER PROCESS EXITING 18699 1726882331.97983: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:32:11 -0400 (0:00:00.022) 0:00:05.576 ****** 18699 1726882331.98017: entering _queue_task() for managed_node1/fail 18699 1726882331.98232: worker is 1 (out of 1 available) 18699 1726882331.98242: exiting _queue_task() for managed_node1/fail 18699 1726882331.98253: done queuing things up, now waiting for results queue to drain 18699 1726882331.98253: waiting for pending results... 
18699 1726882331.98472: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 18699 1726882331.98565: in run() - task 12673a56-9f93-1ce6-d207-000000000132 18699 1726882331.98581: variable 'ansible_search_path' from source: unknown 18699 1726882331.98592: variable 'ansible_search_path' from source: unknown 18699 1726882331.98631: calling self._execute() 18699 1726882331.98711: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882331.98722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882331.98737: variable 'omit' from source: magic vars 18699 1726882331.99068: variable 'ansible_distribution_major_version' from source: facts 18699 1726882331.99084: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882331.99226: variable 'type' from source: set_fact 18699 1726882331.99240: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 18699 1726882331.99248: when evaluation is False, skipping this task 18699 1726882331.99254: _execute() done 18699 1726882331.99261: dumping result to json 18699 1726882331.99267: done dumping result, returning 18699 1726882331.99276: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-1ce6-d207-000000000132] 18699 1726882331.99284: sending task result for task 12673a56-9f93-1ce6-d207-000000000132 skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 18699 1726882331.99403: no more pending results, returning what we have 18699 1726882331.99406: results queue empty 18699 1726882331.99407: checking for any_errors_fatal 18699 1726882331.99414: done checking for any_errors_fatal 18699 1726882331.99414: checking for max_fail_percentage 18699 1726882331.99416: done checking for max_fail_percentage 18699 1726882331.99417: checking to see if all 
hosts have failed and the running result is not ok 18699 1726882331.99417: done checking to see if all hosts have failed 18699 1726882331.99418: getting the remaining hosts for this loop 18699 1726882331.99419: done getting the remaining hosts for this loop 18699 1726882331.99422: getting the next task for host managed_node1 18699 1726882331.99428: done getting next task for host managed_node1 18699 1726882331.99430: ^ task is: TASK: Include the task 'show_interfaces.yml' 18699 1726882331.99433: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882331.99436: getting variables 18699 1726882331.99438: in VariableManager get_vars() 18699 1726882331.99463: Calling all_inventory to load vars for managed_node1 18699 1726882331.99465: Calling groups_inventory to load vars for managed_node1 18699 1726882331.99468: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882331.99478: Calling all_plugins_play to load vars for managed_node1 18699 1726882331.99480: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882331.99483: Calling groups_plugins_play to load vars for managed_node1 18699 1726882331.99825: done sending task result for task 12673a56-9f93-1ce6-d207-000000000132 18699 1726882331.99829: WORKER PROCESS EXITING 18699 1726882331.99849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.00036: done with get_vars() 18699 1726882332.00045: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:32:11 -0400 (0:00:00.021) 0:00:05.597 ****** 18699 1726882332.00151: entering _queue_task() for managed_node1/include_tasks 18699 1726882332.00352: worker is 1 (out of 1 available) 18699 1726882332.00362: exiting _queue_task() for managed_node1/include_tasks 18699 1726882332.00372: done queuing things up, now waiting for results queue to drain 18699 1726882332.00373: waiting for pending results... 
18699 1726882332.00628: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 18699 1726882332.00732: in run() - task 12673a56-9f93-1ce6-d207-000000000133 18699 1726882332.00748: variable 'ansible_search_path' from source: unknown 18699 1726882332.00754: variable 'ansible_search_path' from source: unknown 18699 1726882332.00806: calling self._execute() 18699 1726882332.00887: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.00900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.00914: variable 'omit' from source: magic vars 18699 1726882332.01311: variable 'ansible_distribution_major_version' from source: facts 18699 1726882332.01327: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882332.01336: _execute() done 18699 1726882332.01343: dumping result to json 18699 1726882332.01350: done dumping result, returning 18699 1726882332.01359: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-1ce6-d207-000000000133] 18699 1726882332.01366: sending task result for task 12673a56-9f93-1ce6-d207-000000000133 18699 1726882332.01464: done sending task result for task 12673a56-9f93-1ce6-d207-000000000133 18699 1726882332.01504: no more pending results, returning what we have 18699 1726882332.01509: in VariableManager get_vars() 18699 1726882332.01541: Calling all_inventory to load vars for managed_node1 18699 1726882332.01544: Calling groups_inventory to load vars for managed_node1 18699 1726882332.01548: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.01559: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.01562: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.01565: Calling groups_plugins_play to load vars for managed_node1 18699 1726882332.01930: WORKER PROCESS EXITING 18699 1726882332.01951: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.02127: done with get_vars() 18699 1726882332.02133: variable 'ansible_search_path' from source: unknown 18699 1726882332.02134: variable 'ansible_search_path' from source: unknown 18699 1726882332.02162: we have included files to process 18699 1726882332.02163: generating all_blocks data 18699 1726882332.02164: done generating all_blocks data 18699 1726882332.02169: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18699 1726882332.02170: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18699 1726882332.02171: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18699 1726882332.02267: in VariableManager get_vars() 18699 1726882332.02283: done with get_vars() 18699 1726882332.02371: done processing included file 18699 1726882332.02373: iterating over new_blocks loaded from include file 18699 1726882332.02374: in VariableManager get_vars() 18699 1726882332.02384: done with get_vars() 18699 1726882332.02386: filtering new block on tags 18699 1726882332.02402: done filtering new block on tags 18699 1726882332.02405: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 18699 1726882332.02409: extending task lists for all hosts with included blocks 18699 1726882332.03099: done extending task lists 18699 1726882332.03100: done processing included files 18699 1726882332.03101: results queue empty 18699 1726882332.03102: checking for any_errors_fatal 18699 1726882332.03104: done checking for any_errors_fatal 18699 1726882332.03105: checking for 
max_fail_percentage 18699 1726882332.03106: done checking for max_fail_percentage 18699 1726882332.03107: checking to see if all hosts have failed and the running result is not ok 18699 1726882332.03238: done checking to see if all hosts have failed 18699 1726882332.03239: getting the remaining hosts for this loop 18699 1726882332.03241: done getting the remaining hosts for this loop 18699 1726882332.03244: getting the next task for host managed_node1 18699 1726882332.03248: done getting next task for host managed_node1 18699 1726882332.03250: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18699 1726882332.03253: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882332.03255: getting variables 18699 1726882332.03256: in VariableManager get_vars() 18699 1726882332.03265: Calling all_inventory to load vars for managed_node1 18699 1726882332.03267: Calling groups_inventory to load vars for managed_node1 18699 1726882332.03269: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.03274: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.03276: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.03279: Calling groups_plugins_play to load vars for managed_node1 18699 1726882332.03727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.04106: done with get_vars() 18699 1726882332.04114: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:32:12 -0400 (0:00:00.040) 0:00:05.637 ****** 18699 1726882332.04181: entering _queue_task() for managed_node1/include_tasks 18699 1726882332.04711: worker is 1 (out of 1 available) 18699 1726882332.04721: exiting _queue_task() for managed_node1/include_tasks 18699 1726882332.04731: done queuing things up, now waiting for results queue to drain 18699 1726882332.04732: waiting for pending results... 
18699 1726882332.05068: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 18699 1726882332.05252: in run() - task 12673a56-9f93-1ce6-d207-00000000015c 18699 1726882332.05449: variable 'ansible_search_path' from source: unknown 18699 1726882332.05453: variable 'ansible_search_path' from source: unknown 18699 1726882332.05456: calling self._execute() 18699 1726882332.05586: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.05603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.05619: variable 'omit' from source: magic vars 18699 1726882332.06387: variable 'ansible_distribution_major_version' from source: facts 18699 1726882332.06407: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882332.06418: _execute() done 18699 1726882332.06428: dumping result to json 18699 1726882332.06436: done dumping result, returning 18699 1726882332.06447: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-1ce6-d207-00000000015c] 18699 1726882332.06456: sending task result for task 12673a56-9f93-1ce6-d207-00000000015c 18699 1726882332.06577: done sending task result for task 12673a56-9f93-1ce6-d207-00000000015c 18699 1726882332.06580: WORKER PROCESS EXITING 18699 1726882332.06632: no more pending results, returning what we have 18699 1726882332.06636: in VariableManager get_vars() 18699 1726882332.06690: Calling all_inventory to load vars for managed_node1 18699 1726882332.06692: Calling groups_inventory to load vars for managed_node1 18699 1726882332.06700: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.06711: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.06713: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.06715: Calling groups_plugins_play to load vars for managed_node1 18699 
1726882332.06944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.07151: done with get_vars() 18699 1726882332.07165: variable 'ansible_search_path' from source: unknown 18699 1726882332.07166: variable 'ansible_search_path' from source: unknown 18699 1726882332.07236: we have included files to process 18699 1726882332.07237: generating all_blocks data 18699 1726882332.07239: done generating all_blocks data 18699 1726882332.07240: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18699 1726882332.07241: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18699 1726882332.07244: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18699 1726882332.07608: done processing included file 18699 1726882332.07610: iterating over new_blocks loaded from include file 18699 1726882332.07612: in VariableManager get_vars() 18699 1726882332.07626: done with get_vars() 18699 1726882332.07628: filtering new block on tags 18699 1726882332.07646: done filtering new block on tags 18699 1726882332.07648: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 18699 1726882332.07652: extending task lists for all hosts with included blocks 18699 1726882332.07778: done extending task lists 18699 1726882332.07779: done processing included files 18699 1726882332.07780: results queue empty 18699 1726882332.07781: checking for any_errors_fatal 18699 1726882332.07783: done checking for any_errors_fatal 18699 1726882332.07783: checking for max_fail_percentage 18699 1726882332.07784: done 
checking for max_fail_percentage 18699 1726882332.07785: checking to see if all hosts have failed and the running result is not ok 18699 1726882332.07786: done checking to see if all hosts have failed 18699 1726882332.07786: getting the remaining hosts for this loop 18699 1726882332.07787: done getting the remaining hosts for this loop 18699 1726882332.07789: getting the next task for host managed_node1 18699 1726882332.07795: done getting next task for host managed_node1 18699 1726882332.07797: ^ task is: TASK: Gather current interface info 18699 1726882332.07800: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882332.07802: getting variables 18699 1726882332.07803: in VariableManager get_vars() 18699 1726882332.07811: Calling all_inventory to load vars for managed_node1 18699 1726882332.07813: Calling groups_inventory to load vars for managed_node1 18699 1726882332.07815: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.07819: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.07822: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.07824: Calling groups_plugins_play to load vars for managed_node1 18699 1726882332.07976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.08271: done with get_vars() 18699 1726882332.08279: done getting variables 18699 1726882332.08434: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:32:12 -0400 (0:00:00.042) 0:00:05.680 ****** 18699 1726882332.08462: entering _queue_task() for managed_node1/command 18699 1726882332.08906: worker is 1 (out of 1 available) 18699 1726882332.08914: exiting _queue_task() for managed_node1/command 18699 1726882332.08922: done queuing things up, now waiting for results queue to drain 18699 1726882332.08922: waiting for pending results... 
18699 1726882332.09049: running TaskExecutor() for managed_node1/TASK: Gather current interface info 18699 1726882332.09077: in run() - task 12673a56-9f93-1ce6-d207-000000000193 18699 1726882332.09098: variable 'ansible_search_path' from source: unknown 18699 1726882332.09106: variable 'ansible_search_path' from source: unknown 18699 1726882332.09255: calling self._execute() 18699 1726882332.09258: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.09261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.09263: variable 'omit' from source: magic vars 18699 1726882332.09648: variable 'ansible_distribution_major_version' from source: facts 18699 1726882332.09664: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882332.09675: variable 'omit' from source: magic vars 18699 1726882332.09759: variable 'omit' from source: magic vars 18699 1726882332.09828: variable 'omit' from source: magic vars 18699 1726882332.09895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882332.09943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882332.09971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882332.09996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.10015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.10050: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882332.10058: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.10065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 
1726882332.10164: Set connection var ansible_connection to ssh 18699 1726882332.10176: Set connection var ansible_pipelining to False 18699 1726882332.10186: Set connection var ansible_shell_executable to /bin/sh 18699 1726882332.10239: Set connection var ansible_timeout to 10 18699 1726882332.10242: Set connection var ansible_shell_type to sh 18699 1726882332.10244: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882332.10246: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.10248: variable 'ansible_connection' from source: unknown 18699 1726882332.10253: variable 'ansible_module_compression' from source: unknown 18699 1726882332.10259: variable 'ansible_shell_type' from source: unknown 18699 1726882332.10265: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.10271: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.10277: variable 'ansible_pipelining' from source: unknown 18699 1726882332.10282: variable 'ansible_timeout' from source: unknown 18699 1726882332.10289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.10425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882332.10455: variable 'omit' from source: magic vars 18699 1726882332.10458: starting attempt loop 18699 1726882332.10460: running the handler 18699 1726882332.10565: _low_level_execute_command(): starting 18699 1726882332.10569: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882332.11172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882332.11232: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.11298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.11317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.11348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.11419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.13103: stdout chunk (state=3): >>>/root <<< 18699 1726882332.13187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.13302: stdout chunk (state=3): >>><<< 18699 1726882332.13305: stderr chunk (state=3): >>><<< 18699 1726882332.13309: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.13405: _low_level_execute_command(): starting 18699 1726882332.13409: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155 `" && echo ansible-tmp-1726882332.133087-18993-201352290790155="` echo /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155 `" ) && sleep 0' 18699 1726882332.14005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882332.14023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.14044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882332.14168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.14210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.14302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.16168: stdout chunk (state=3): >>>ansible-tmp-1726882332.133087-18993-201352290790155=/root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155 <<< 18699 1726882332.16299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.16368: stderr chunk (state=3): >>><<< 18699 1726882332.16372: stdout chunk (state=3): >>><<< 18699 1726882332.16375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882332.133087-18993-201352290790155=/root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.16587: variable 'ansible_module_compression' from source: unknown 18699 1726882332.16591: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882332.16596: variable 'ansible_facts' from source: unknown 18699 1726882332.16738: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py 18699 1726882332.17079: Sending initial data 18699 1726882332.17129: Sent initial data (155 bytes) 18699 1726882332.18412: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.18471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.18489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.18583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.20091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882332.20152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882332.20211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp2ml9jo8k /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py <<< 18699 1726882332.20225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py" <<< 18699 1726882332.20284: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp2ml9jo8k" to remote "/root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py" <<< 18699 1726882332.21222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.21407: stderr chunk (state=3): >>><<< 18699 1726882332.21411: stdout chunk (state=3): >>><<< 18699 1726882332.21413: done transferring module to remote 18699 1726882332.21415: _low_level_execute_command(): starting 18699 1726882332.21417: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/ /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py && sleep 0' 18699 1726882332.22167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.22171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882332.22206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 18699 1726882332.22210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882332.22213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.22251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.22263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.22317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.24085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.24114: stdout chunk (state=3): >>><<< 18699 1726882332.24118: stderr chunk (state=3): >>><<< 18699 1726882332.24218: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.24226: _low_level_execute_command(): starting 18699 1726882332.24229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/AnsiballZ_command.py && sleep 0' 18699 1726882332.24783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882332.24900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.24939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.24979: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.40132: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:32:12.397111", "end": "2024-09-20 21:32:12.400122", "delta": "0:00:00.003011", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882332.41904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882332.42005: stdout chunk (state=3): >>><<< 18699 1726882332.42009: stderr chunk (state=3): >>><<< 18699 1726882332.42013: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:32:12.397111", "end": "2024-09-20 21:32:12.400122", "delta": "0:00:00.003011", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882332.42016: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882332.42018: _low_level_execute_command(): starting 18699 1726882332.42021: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882332.133087-18993-201352290790155/ > /dev/null 2>&1 && sleep 0' 18699 1726882332.43273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.43314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.43414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.43451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.45285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.45290: stdout chunk (state=3): >>><<< 18699 1726882332.45292: stderr chunk (state=3): >>><<< 18699 1726882332.45311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.45321: handler run complete 18699 1726882332.45354: Evaluated conditional (False): False 18699 1726882332.45409: attempt loop complete, returning result 18699 1726882332.45447: _execute() done 18699 1726882332.45455: dumping result to json 18699 1726882332.45551: done dumping result, returning 18699 1726882332.45554: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [12673a56-9f93-1ce6-d207-000000000193] 18699 1726882332.45557: sending task result for task 12673a56-9f93-1ce6-d207-000000000193 18699 1726882332.45847: done sending task result for task 12673a56-9f93-1ce6-d207-000000000193 18699 1726882332.45850: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003011", "end": "2024-09-20 21:32:12.400122", "rc": 0, "start": "2024-09-20 21:32:12.397111" } STDOUT: bonding_masters eth0 lo 18699 1726882332.45947: no more pending results, returning what we have 18699 1726882332.45950: results queue empty 18699 1726882332.45951: checking for any_errors_fatal 18699 1726882332.45953: done checking for any_errors_fatal 18699 1726882332.45954: checking for max_fail_percentage 18699 1726882332.45955: done checking for max_fail_percentage 18699 1726882332.45956: checking to see if all hosts have failed and the running result is not ok 18699 1726882332.45957: done checking to see if all hosts have failed 18699 1726882332.45958: getting the remaining hosts 
for this loop 18699 1726882332.45959: done getting the remaining hosts for this loop 18699 1726882332.45962: getting the next task for host managed_node1 18699 1726882332.45970: done getting next task for host managed_node1 18699 1726882332.45972: ^ task is: TASK: Set current_interfaces 18699 1726882332.45977: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882332.45981: getting variables 18699 1726882332.45983: in VariableManager get_vars() 18699 1726882332.46012: Calling all_inventory to load vars for managed_node1 18699 1726882332.46015: Calling groups_inventory to load vars for managed_node1 18699 1726882332.46018: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.46030: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.46033: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.46036: Calling groups_plugins_play to load vars for managed_node1 18699 1726882332.46811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.47243: done with get_vars() 18699 1726882332.47253: done getting variables 18699 1726882332.47519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:32:12 -0400 (0:00:00.390) 0:00:06.071 ****** 18699 1726882332.47551: entering _queue_task() for managed_node1/set_fact 18699 1726882332.48009: worker is 1 (out of 1 available) 18699 1726882332.48020: exiting _queue_task() for managed_node1/set_fact 18699 1726882332.48032: done queuing things up, now waiting for results queue to drain 18699 1726882332.48033: waiting for pending results... 
18699 1726882332.48534: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 18699 1726882332.48701: in run() - task 12673a56-9f93-1ce6-d207-000000000194 18699 1726882332.48706: variable 'ansible_search_path' from source: unknown 18699 1726882332.48709: variable 'ansible_search_path' from source: unknown 18699 1726882332.48712: calling self._execute() 18699 1726882332.48764: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.48777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.48797: variable 'omit' from source: magic vars 18699 1726882332.49146: variable 'ansible_distribution_major_version' from source: facts 18699 1726882332.49161: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882332.49175: variable 'omit' from source: magic vars 18699 1726882332.49313: variable 'omit' from source: magic vars 18699 1726882332.49503: variable '_current_interfaces' from source: set_fact 18699 1726882332.49506: variable 'omit' from source: magic vars 18699 1726882332.49546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882332.49585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882332.49623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882332.49646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.49663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.49700: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882332.49709: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.49725: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.49833: Set connection var ansible_connection to ssh 18699 1726882332.49847: Set connection var ansible_pipelining to False 18699 1726882332.49898: Set connection var ansible_shell_executable to /bin/sh 18699 1726882332.49901: Set connection var ansible_timeout to 10 18699 1726882332.49904: Set connection var ansible_shell_type to sh 18699 1726882332.49906: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882332.49910: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.49918: variable 'ansible_connection' from source: unknown 18699 1726882332.49925: variable 'ansible_module_compression' from source: unknown 18699 1726882332.49938: variable 'ansible_shell_type' from source: unknown 18699 1726882332.49945: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.49952: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.49959: variable 'ansible_pipelining' from source: unknown 18699 1726882332.49966: variable 'ansible_timeout' from source: unknown 18699 1726882332.50046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.50131: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882332.50150: variable 'omit' from source: magic vars 18699 1726882332.50267: starting attempt loop 18699 1726882332.50271: running the handler 18699 1726882332.50273: handler run complete 18699 1726882332.50275: attempt loop complete, returning result 18699 1726882332.50277: _execute() done 18699 1726882332.50279: dumping result to json 18699 1726882332.50281: done dumping result, returning 18699 
1726882332.50284: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [12673a56-9f93-1ce6-d207-000000000194] 18699 1726882332.50286: sending task result for task 12673a56-9f93-1ce6-d207-000000000194 18699 1726882332.50350: done sending task result for task 12673a56-9f93-1ce6-d207-000000000194 18699 1726882332.50353: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18699 1726882332.50430: no more pending results, returning what we have 18699 1726882332.50434: results queue empty 18699 1726882332.50435: checking for any_errors_fatal 18699 1726882332.50443: done checking for any_errors_fatal 18699 1726882332.50444: checking for max_fail_percentage 18699 1726882332.50445: done checking for max_fail_percentage 18699 1726882332.50446: checking to see if all hosts have failed and the running result is not ok 18699 1726882332.50447: done checking to see if all hosts have failed 18699 1726882332.50448: getting the remaining hosts for this loop 18699 1726882332.50449: done getting the remaining hosts for this loop 18699 1726882332.50453: getting the next task for host managed_node1 18699 1726882332.50462: done getting next task for host managed_node1 18699 1726882332.50465: ^ task is: TASK: Show current_interfaces 18699 1726882332.50469: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882332.50474: getting variables 18699 1726882332.50475: in VariableManager get_vars() 18699 1726882332.50507: Calling all_inventory to load vars for managed_node1 18699 1726882332.50510: Calling groups_inventory to load vars for managed_node1 18699 1726882332.50513: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.50525: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.50527: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.50530: Calling groups_plugins_play to load vars for managed_node1 18699 1726882332.50926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.51118: done with get_vars() 18699 1726882332.51133: done getting variables 18699 1726882332.51189: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:32:12 -0400 (0:00:00.036) 0:00:06.108 ****** 18699 1726882332.51222: entering _queue_task() for managed_node1/debug 18699 1726882332.51469: worker is 1 (out of 1 available) 18699 1726882332.51480: exiting _queue_task() for managed_node1/debug 18699 1726882332.51490: done queuing things up, now waiting for results queue to drain 18699 1726882332.51491: waiting for pending results... 
18699 1726882332.51803: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 18699 1726882332.51814: in run() - task 12673a56-9f93-1ce6-d207-00000000015d 18699 1726882332.51833: variable 'ansible_search_path' from source: unknown 18699 1726882332.51840: variable 'ansible_search_path' from source: unknown 18699 1726882332.51879: calling self._execute() 18699 1726882332.51961: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.51972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.51989: variable 'omit' from source: magic vars 18699 1726882332.52343: variable 'ansible_distribution_major_version' from source: facts 18699 1726882332.52359: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882332.52370: variable 'omit' from source: magic vars 18699 1726882332.52419: variable 'omit' from source: magic vars 18699 1726882332.52519: variable 'current_interfaces' from source: set_fact 18699 1726882332.52555: variable 'omit' from source: magic vars 18699 1726882332.52660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882332.52664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882332.52667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882332.52684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.52703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.52738: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882332.52747: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.52755: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.52858: Set connection var ansible_connection to ssh 18699 1726882332.52874: Set connection var ansible_pipelining to False 18699 1726882332.52889: Set connection var ansible_shell_executable to /bin/sh 18699 1726882332.52903: Set connection var ansible_timeout to 10 18699 1726882332.52989: Set connection var ansible_shell_type to sh 18699 1726882332.52994: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882332.52997: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.52999: variable 'ansible_connection' from source: unknown 18699 1726882332.53001: variable 'ansible_module_compression' from source: unknown 18699 1726882332.53004: variable 'ansible_shell_type' from source: unknown 18699 1726882332.53006: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.53008: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.53010: variable 'ansible_pipelining' from source: unknown 18699 1726882332.53012: variable 'ansible_timeout' from source: unknown 18699 1726882332.53014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.53132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882332.53149: variable 'omit' from source: magic vars 18699 1726882332.53160: starting attempt loop 18699 1726882332.53166: running the handler 18699 1726882332.53222: handler run complete 18699 1726882332.53241: attempt loop complete, returning result 18699 1726882332.53249: _execute() done 18699 1726882332.53257: dumping result to json 18699 1726882332.53264: done dumping result, returning 18699 1726882332.53274: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [12673a56-9f93-1ce6-d207-00000000015d] 18699 1726882332.53283: sending task result for task 12673a56-9f93-1ce6-d207-00000000015d 18699 1726882332.53485: done sending task result for task 12673a56-9f93-1ce6-d207-00000000015d 18699 1726882332.53489: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18699 1726882332.53566: no more pending results, returning what we have 18699 1726882332.53570: results queue empty 18699 1726882332.53570: checking for any_errors_fatal 18699 1726882332.53574: done checking for any_errors_fatal 18699 1726882332.53575: checking for max_fail_percentage 18699 1726882332.53576: done checking for max_fail_percentage 18699 1726882332.53577: checking to see if all hosts have failed and the running result is not ok 18699 1726882332.53578: done checking to see if all hosts have failed 18699 1726882332.53579: getting the remaining hosts for this loop 18699 1726882332.53580: done getting the remaining hosts for this loop 18699 1726882332.53583: getting the next task for host managed_node1 18699 1726882332.53590: done getting next task for host managed_node1 18699 1726882332.53592: ^ task is: TASK: Install iproute 18699 1726882332.53598: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882332.53601: getting variables 18699 1726882332.53602: in VariableManager get_vars() 18699 1726882332.53628: Calling all_inventory to load vars for managed_node1 18699 1726882332.53631: Calling groups_inventory to load vars for managed_node1 18699 1726882332.53634: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882332.53644: Calling all_plugins_play to load vars for managed_node1 18699 1726882332.53647: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882332.53650: Calling groups_plugins_play to load vars for managed_node1 18699 1726882332.53857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882332.54148: done with get_vars() 18699 1726882332.54156: done getting variables 18699 1726882332.54207: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:32:12 -0400 (0:00:00.030) 0:00:06.138 ****** 18699 1726882332.54232: entering _queue_task() for managed_node1/package 18699 1726882332.54466: worker is 1 (out of 1 available) 18699 1726882332.54479: exiting _queue_task() for managed_node1/package 18699 1726882332.54490: done queuing things up, now waiting for results queue to drain 18699 1726882332.54492: waiting for pending results... 
18699 1726882332.54735: running TaskExecutor() for managed_node1/TASK: Install iproute 18699 1726882332.54834: in run() - task 12673a56-9f93-1ce6-d207-000000000134 18699 1726882332.54854: variable 'ansible_search_path' from source: unknown 18699 1726882332.54862: variable 'ansible_search_path' from source: unknown 18699 1726882332.54916: calling self._execute() 18699 1726882332.54981: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.55024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.55028: variable 'omit' from source: magic vars 18699 1726882332.55357: variable 'ansible_distribution_major_version' from source: facts 18699 1726882332.55374: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882332.55385: variable 'omit' from source: magic vars 18699 1726882332.55424: variable 'omit' from source: magic vars 18699 1726882332.55678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882332.57900: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882332.57987: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882332.58033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882332.58078: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882332.58112: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882332.58213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882332.58247: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882332.58283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882332.58329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882332.58382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882332.58458: variable '__network_is_ostree' from source: set_fact 18699 1726882332.58470: variable 'omit' from source: magic vars 18699 1726882332.58510: variable 'omit' from source: magic vars 18699 1726882332.58544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882332.58576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882332.58698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882332.58704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.58707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882332.58709: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882332.58711: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.58713: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18699 1726882332.58782: Set connection var ansible_connection to ssh 18699 1726882332.58800: Set connection var ansible_pipelining to False 18699 1726882332.58813: Set connection var ansible_shell_executable to /bin/sh 18699 1726882332.58830: Set connection var ansible_timeout to 10 18699 1726882332.58838: Set connection var ansible_shell_type to sh 18699 1726882332.58848: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882332.58883: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.58892: variable 'ansible_connection' from source: unknown 18699 1726882332.58903: variable 'ansible_module_compression' from source: unknown 18699 1726882332.58911: variable 'ansible_shell_type' from source: unknown 18699 1726882332.58919: variable 'ansible_shell_executable' from source: unknown 18699 1726882332.58935: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882332.58938: variable 'ansible_pipelining' from source: unknown 18699 1726882332.59044: variable 'ansible_timeout' from source: unknown 18699 1726882332.59048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882332.59064: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882332.59080: variable 'omit' from source: magic vars 18699 1726882332.59090: starting attempt loop 18699 1726882332.59101: running the handler 18699 1726882332.59113: variable 'ansible_facts' from source: unknown 18699 1726882332.59120: variable 'ansible_facts' from source: unknown 18699 1726882332.59160: _low_level_execute_command(): starting 18699 1726882332.59172: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 
1726882332.59877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882332.59898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.59922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882332.60028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.60073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.60115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.61672: stdout chunk (state=3): >>>/root <<< 18699 1726882332.61799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.61811: stdout chunk (state=3): >>><<< 18699 1726882332.61823: stderr chunk (state=3): >>><<< 18699 1726882332.61906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.61917: _low_level_execute_command(): starting 18699 1726882332.61920: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070 `" && echo ansible-tmp-1726882332.618412-19020-225984317133070="` echo /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070 `" ) && sleep 0' 18699 1726882332.62271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882332.62290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882332.62303: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.62343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.62360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.62408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.64263: stdout chunk (state=3): >>>ansible-tmp-1726882332.618412-19020-225984317133070=/root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070 <<< 18699 1726882332.64366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.64390: stderr chunk (state=3): >>><<< 18699 1726882332.64396: stdout chunk (state=3): >>><<< 18699 1726882332.64413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882332.618412-19020-225984317133070=/root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.64440: variable 'ansible_module_compression' from source: unknown 18699 1726882332.64489: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 18699 1726882332.64495: ANSIBALLZ: Acquiring lock 18699 1726882332.64497: ANSIBALLZ: Lock acquired: 140254445799856 18699 1726882332.64500: ANSIBALLZ: Creating module 18699 1726882332.79571: ANSIBALLZ: Writing module into payload 18699 1726882332.79924: ANSIBALLZ: Writing module 18699 1726882332.79928: ANSIBALLZ: Renaming module 18699 1726882332.79930: ANSIBALLZ: Done creating module 18699 1726882332.79932: variable 'ansible_facts' from source: unknown 18699 1726882332.79946: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py 18699 1726882332.80122: Sending initial data 18699 1726882332.80125: Sent initial data (151 bytes) 18699 1726882332.80724: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882332.80733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.80744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882332.80773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.80858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.80882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.80899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.80968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.82524: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18699 1726882332.82535: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882332.82576: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882332.82637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpc36aus7e /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py <<< 18699 1726882332.82641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py" <<< 18699 1726882332.82686: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpc36aus7e" to remote "/root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py" <<< 18699 1726882332.83582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.83620: stderr chunk (state=3): >>><<< 18699 1726882332.83630: stdout chunk (state=3): >>><<< 18699 1726882332.83660: done transferring module to remote 18699 1726882332.83678: _low_level_execute_command(): starting 18699 1726882332.83758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/ /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py && sleep 0' 18699 1726882332.84399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882332.84402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.84404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882332.84406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882332.84408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 <<< 18699 1726882332.84410: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882332.84412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.84414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882332.84416: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882332.84418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882332.84420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.84422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882332.84424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882332.84426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882332.84428: stderr chunk (state=3): >>>debug2: match found <<< 18699 1726882332.84430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.84439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.84450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.84471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.84532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882332.86301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882332.86310: stderr chunk (state=3): >>><<< 18699 1726882332.86313: stdout chunk (state=3): >>><<< 18699 1726882332.86392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882332.86397: _low_level_execute_command(): starting 18699 1726882332.86400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/AnsiballZ_dnf.py && sleep 0' 18699 1726882332.86942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882332.86948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882332.86965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882332.86971: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.87054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882332.87058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882332.87060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882332.87068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882332.87071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882332.87085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882332.87164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.27523: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 18699 
1726882333.31343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882333.31513: stderr chunk (state=3): >>><<< 18699 1726882333.31517: stdout chunk (state=3): >>><<< 18699 1726882333.31538: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882333.31581: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882333.31589: _low_level_execute_command(): starting 18699 1726882333.31599: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882332.618412-19020-225984317133070/ > /dev/null 2>&1 && sleep 0' 18699 1726882333.32647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882333.32653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882333.32655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.32657: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.32659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.32776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.33089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.33266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.35146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.35150: stdout chunk (state=3): >>><<< 18699 1726882333.35205: stderr chunk (state=3): >>><<< 18699 1726882333.35209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.35211: handler run complete 18699 1726882333.35459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882333.35848: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882333.36074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882333.36078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882333.36080: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882333.36229: variable '__install_status' from source: unknown 18699 1726882333.36316: Evaluated conditional (__install_status is success): True 18699 1726882333.36400: attempt loop complete, returning result 18699 1726882333.36403: _execute() done 18699 1726882333.36405: dumping result to json 18699 1726882333.36406: done dumping result, returning 18699 1726882333.36416: done running TaskExecutor() for managed_node1/TASK: Install iproute [12673a56-9f93-1ce6-d207-000000000134] 18699 1726882333.36422: sending task result for task 12673a56-9f93-1ce6-d207-000000000134 ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 18699 1726882333.36770: no more pending results, returning what we have 18699 1726882333.36773: results queue empty 18699 1726882333.36774: checking for any_errors_fatal 18699 1726882333.36778: done checking for 
any_errors_fatal 18699 1726882333.36779: checking for max_fail_percentage 18699 1726882333.36781: done checking for max_fail_percentage 18699 1726882333.36782: checking to see if all hosts have failed and the running result is not ok 18699 1726882333.36782: done checking to see if all hosts have failed 18699 1726882333.36783: getting the remaining hosts for this loop 18699 1726882333.36784: done getting the remaining hosts for this loop 18699 1726882333.36788: getting the next task for host managed_node1 18699 1726882333.36797: done getting next task for host managed_node1 18699 1726882333.36800: ^ task is: TASK: Create veth interface {{ interface }} 18699 1726882333.36803: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882333.36806: getting variables 18699 1726882333.36808: in VariableManager get_vars() 18699 1726882333.36838: Calling all_inventory to load vars for managed_node1 18699 1726882333.36841: Calling groups_inventory to load vars for managed_node1 18699 1726882333.36845: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882333.36856: Calling all_plugins_play to load vars for managed_node1 18699 1726882333.36859: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882333.36862: Calling groups_plugins_play to load vars for managed_node1 18699 1726882333.37457: done sending task result for task 12673a56-9f93-1ce6-d207-000000000134 18699 1726882333.37462: WORKER PROCESS EXITING 18699 1726882333.37477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882333.37911: done with get_vars() 18699 1726882333.37923: done getting variables 18699 1726882333.37985: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882333.38351: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:32:13 -0400 (0:00:00.841) 0:00:06.979 ****** 18699 1726882333.38383: entering _queue_task() for managed_node1/command 18699 1726882333.39032: worker is 1 (out of 1 available) 18699 1726882333.39046: exiting _queue_task() for managed_node1/command 18699 1726882333.39060: done queuing things up, now waiting for results queue to drain 18699 1726882333.39061: waiting for pending results... 
18699 1726882333.39446: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 18699 1726882333.39542: in run() - task 12673a56-9f93-1ce6-d207-000000000135 18699 1726882333.39624: variable 'ansible_search_path' from source: unknown 18699 1726882333.39700: variable 'ansible_search_path' from source: unknown 18699 1726882333.40199: variable 'interface' from source: set_fact 18699 1726882333.40284: variable 'interface' from source: set_fact 18699 1726882333.40567: variable 'interface' from source: set_fact 18699 1726882333.40713: Loaded config def from plugin (lookup/items) 18699 1726882333.40999: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 18699 1726882333.41002: variable 'omit' from source: magic vars 18699 1726882333.41048: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882333.41211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882333.41228: variable 'omit' from source: magic vars 18699 1726882333.42323: variable 'ansible_distribution_major_version' from source: facts 18699 1726882333.42337: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882333.42748: variable 'type' from source: set_fact 18699 1726882333.42759: variable 'state' from source: include params 18699 1726882333.42768: variable 'interface' from source: set_fact 18699 1726882333.42776: variable 'current_interfaces' from source: set_fact 18699 1726882333.43198: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18699 1726882333.43202: variable 'omit' from source: magic vars 18699 1726882333.43205: variable 'omit' from source: magic vars 18699 1726882333.43207: variable 'item' from source: unknown 18699 1726882333.43209: variable 'item' from source: unknown 18699 1726882333.43211: variable 'omit' from source: magic vars 18699 1726882333.43235: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882333.43428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882333.43453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882333.43476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882333.43495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882333.43531: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882333.43540: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882333.43548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882333.43998: Set connection var ansible_connection to ssh 18699 1726882333.44001: Set connection var ansible_pipelining to False 18699 1726882333.44004: Set connection var ansible_shell_executable to /bin/sh 18699 1726882333.44006: Set connection var ansible_timeout to 10 18699 1726882333.44008: Set connection var ansible_shell_type to sh 18699 1726882333.44010: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882333.44012: variable 'ansible_shell_executable' from source: unknown 18699 1726882333.44014: variable 'ansible_connection' from source: unknown 18699 1726882333.44016: variable 'ansible_module_compression' from source: unknown 18699 1726882333.44018: variable 'ansible_shell_type' from source: unknown 18699 1726882333.44020: variable 'ansible_shell_executable' from source: unknown 18699 1726882333.44023: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882333.44025: variable 'ansible_pipelining' from source: unknown 18699 1726882333.44027: variable 'ansible_timeout' from 
source: unknown 18699 1726882333.44029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882333.44153: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882333.44409: variable 'omit' from source: magic vars 18699 1726882333.44419: starting attempt loop 18699 1726882333.44427: running the handler 18699 1726882333.44445: _low_level_execute_command(): starting 18699 1726882333.44457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882333.45429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882333.45442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882333.45456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.45476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882333.45496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882333.45510: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882333.45525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.45543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882333.45608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.45640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.45664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.45680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.45908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.47343: stdout chunk (state=3): >>>/root <<< 18699 1726882333.47516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.47519: stdout chunk (state=3): >>><<< 18699 1726882333.47522: stderr chunk (state=3): >>><<< 18699 1726882333.47541: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.47566: _low_level_execute_command(): starting 18699 1726882333.47577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840 `" && echo ansible-tmp-1726882333.4755323-19047-96064889023840="` echo /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840 `" ) && sleep 0' 18699 1726882333.48584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882333.48799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.48922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.48998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.50849: stdout chunk (state=3): 
>>>ansible-tmp-1726882333.4755323-19047-96064889023840=/root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840 <<< 18699 1726882333.50998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.51012: stdout chunk (state=3): >>><<< 18699 1726882333.51028: stderr chunk (state=3): >>><<< 18699 1726882333.51049: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882333.4755323-19047-96064889023840=/root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.51086: variable 'ansible_module_compression' from source: unknown 18699 1726882333.51176: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882333.51339: variable 'ansible_facts' 
from source: unknown 18699 1726882333.51588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py 18699 1726882333.52435: Sending initial data 18699 1726882333.52438: Sent initial data (155 bytes) 18699 1726882333.53672: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882333.53676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.54011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.54041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.54171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.55682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882333.55750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882333.55790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpvyynlg5m /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py <<< 18699 1726882333.55798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py" <<< 18699 1726882333.55863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpvyynlg5m" to remote "/root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py" <<< 18699 1726882333.57228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.57283: stderr chunk (state=3): >>><<< 18699 1726882333.57291: stdout chunk (state=3): >>><<< 18699 1726882333.57326: done transferring module to remote 18699 1726882333.57341: _low_level_execute_command(): starting 18699 1726882333.57350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/ /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py && 
sleep 0' 18699 1726882333.58515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882333.58798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.58825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.58862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.58902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.60856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.60868: stdout chunk (state=3): >>><<< 18699 1726882333.60878: stderr chunk (state=3): >>><<< 18699 1726882333.60900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.60908: _low_level_execute_command(): starting 18699 1726882333.60917: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/AnsiballZ_command.py && sleep 0' 18699 1726882333.62053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882333.62089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882333.62107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.62201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.62244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.62266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.62284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.62371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.78106: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:32:13.773309", "end": "2024-09-20 21:32:13.778900", "delta": "0:00:00.005591", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882333.80286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882333.80323: stderr chunk (state=3): >>><<< 18699 1726882333.80326: stdout chunk (state=3): >>><<< 18699 1726882333.80346: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 21:32:13.773309", "end": "2024-09-20 21:32:13.778900", "delta": "0:00:00.005591", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882333.80379: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882333.80389: _low_level_execute_command(): starting 18699 1726882333.80392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882333.4755323-19047-96064889023840/ > /dev/null 2>&1 && sleep 0' 18699 1726882333.80833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882333.80836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882333.80839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.80841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.80847: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.80900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.80907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.80952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.85843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.85866: stderr chunk (state=3): >>><<< 18699 1726882333.85869: stdout chunk (state=3): >>><<< 18699 1726882333.85884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.85889: handler run complete 18699 1726882333.85910: Evaluated conditional (False): False 18699 1726882333.85918: attempt loop complete, returning result 18699 1726882333.85934: variable 'item' from source: unknown 18699 1726882333.86000: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005591", "end": "2024-09-20 21:32:13.778900", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 21:32:13.773309" } 18699 1726882333.86159: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882333.86162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882333.86164: variable 'omit' from source: magic vars 18699 1726882333.86237: variable 'ansible_distribution_major_version' from source: facts 18699 1726882333.86241: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882333.86358: variable 'type' from source: set_fact 18699 1726882333.86362: variable 'state' from source: include params 18699 1726882333.86365: variable 'interface' from source: set_fact 18699 1726882333.86369: variable 'current_interfaces' from source: set_fact 18699 1726882333.86375: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18699 1726882333.86379: variable 'omit' from source: magic vars 18699 1726882333.86397: variable 'omit' from source: magic vars 18699 1726882333.86426: variable 'item' from source: unknown 18699 1726882333.86468: variable 'item' from source: unknown 18699 1726882333.86479: variable 'omit' from source: magic vars 18699 1726882333.86501: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882333.86504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882333.86512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882333.86522: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882333.86525: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882333.86527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882333.86572: Set connection var ansible_connection to ssh 18699 1726882333.86575: Set connection var ansible_pipelining to False 18699 1726882333.86581: Set connection var ansible_shell_executable to /bin/sh 18699 1726882333.86586: Set connection var ansible_timeout to 10 18699 1726882333.86588: Set connection var ansible_shell_type to sh 18699 1726882333.86597: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882333.86616: variable 'ansible_shell_executable' from source: unknown 18699 1726882333.86619: variable 'ansible_connection' from source: unknown 18699 1726882333.86621: variable 'ansible_module_compression' from source: unknown 18699 1726882333.86624: variable 'ansible_shell_type' from source: unknown 18699 1726882333.86626: variable 'ansible_shell_executable' from source: unknown 18699 1726882333.86628: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882333.86630: variable 'ansible_pipelining' from source: unknown 18699 1726882333.86632: variable 'ansible_timeout' from source: unknown 18699 1726882333.86639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882333.86700: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882333.86707: variable 'omit' from source: magic vars 18699 1726882333.86713: starting attempt loop 18699 1726882333.86715: running the handler 18699 1726882333.86721: _low_level_execute_command(): starting 18699 1726882333.86725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882333.87204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.87210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.87258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.87284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.87325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 
1726882333.88921: stdout chunk (state=3): >>>/root <<< 18699 1726882333.89000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.89022: stderr chunk (state=3): >>><<< 18699 1726882333.89026: stdout chunk (state=3): >>><<< 18699 1726882333.89038: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.89045: _low_level_execute_command(): starting 18699 1726882333.89050: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681 `" && echo ansible-tmp-1726882333.8903725-19047-222885231079681="` echo /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681 `" ) && sleep 0' 18699 1726882333.89459: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882333.89462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882333.89465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882333.89467: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882333.89469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.89521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.89530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.89533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.89566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.91415: stdout chunk (state=3): >>>ansible-tmp-1726882333.8903725-19047-222885231079681=/root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681 <<< 18699 1726882333.91522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.91542: stderr chunk (state=3): >>><<< 18699 1726882333.91545: stdout chunk (state=3): >>><<< 18699 1726882333.91556: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882333.8903725-19047-222885231079681=/root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.91578: variable 'ansible_module_compression' from source: unknown 18699 1726882333.91611: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882333.91627: variable 'ansible_facts' from source: unknown 18699 1726882333.91668: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py 18699 1726882333.91755: Sending initial data 18699 1726882333.91759: Sent initial data (156 bytes) 18699 1726882333.92167: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882333.92170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882333.92173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.92175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.92177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.92230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882333.92238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.92275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.93786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882333.93791: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882333.93825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882333.93869: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpjzi3aayg /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py <<< 18699 1726882333.93872: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py" <<< 18699 1726882333.93911: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpjzi3aayg" to remote "/root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py" <<< 18699 1726882333.94433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.94466: stderr chunk (state=3): >>><<< 18699 1726882333.94470: stdout chunk (state=3): >>><<< 18699 1726882333.94497: done transferring module to remote 18699 1726882333.94508: _low_level_execute_command(): starting 18699 1726882333.94511: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/ /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py && sleep 0' 18699 1726882333.94913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.94916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882333.94918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18699 1726882333.94920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882333.94922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.94968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.94971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.95020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882333.96712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882333.96733: stderr chunk (state=3): >>><<< 18699 1726882333.96736: stdout chunk (state=3): >>><<< 18699 1726882333.96748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882333.96750: _low_level_execute_command(): starting 18699 1726882333.96755: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/AnsiballZ_command.py && sleep 0' 18699 1726882333.97139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882333.97143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.97153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882333.97201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882333.97224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882333.97257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.12709: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:32:14.122355", "end": "2024-09-20 21:32:14.125755", "delta": "0:00:00.003400", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882334.14223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882334.14248: stderr chunk (state=3): >>><<< 18699 1726882334.14265: stdout chunk (state=3): >>><<< 18699 1726882334.14310: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 21:32:14.122355", "end": "2024-09-20 21:32:14.125755", "delta": "0:00:00.003400", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
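(Annotation, not part of the log.) In the exchange above, the remote `AnsiballZ_command.py` wrapper prints a single JSON object on stdout, which the controller parses to build the task result. A sketch of that parsing step, with the JSON payload trimmed from the log (the `invocation` block is omitted here for brevity):

```python
import json

# module stdout copied from the log above (invocation details elided)
raw = ('{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
       '"cmd": ["ip", "link", "set", "peerlsr27", "up"], '
       '"start": "2024-09-20 21:32:14.122355", "end": "2024-09-20 21:32:14.125755", '
       '"delta": "0:00:00.003400", "msg": ""}')

result = json.loads(raw)
# the controller reads rc/changed/cmd from this dict when rendering "ok: [managed_node1] => ..."
assert result["rc"] == 0 and result["changed"]
assert result["cmd"] == ["ip", "link", "set", "peerlsr27", "up"]
```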
18699 1726882334.14415: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882334.14419: _low_level_execute_command(): starting 18699 1726882334.14423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882333.8903725-19047-222885231079681/ > /dev/null 2>&1 && sleep 0' 18699 1726882334.15013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882334.15105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.15139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.15154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.15176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.15250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.17081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.17109: stdout chunk (state=3): >>><<< 18699 1726882334.17112: stderr chunk (state=3): >>><<< 18699 1726882334.17200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.17203: handler 
run complete 18699 1726882334.17206: Evaluated conditional (False): False 18699 1726882334.17208: attempt loop complete, returning result 18699 1726882334.17210: variable 'item' from source: unknown 18699 1726882334.17282: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003400", "end": "2024-09-20 21:32:14.125755", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-20 21:32:14.122355" } 18699 1726882334.17710: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.17713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.17715: variable 'omit' from source: magic vars 18699 1726882334.17718: variable 'ansible_distribution_major_version' from source: facts 18699 1726882334.17720: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882334.17883: variable 'type' from source: set_fact 18699 1726882334.17892: variable 'state' from source: include params 18699 1726882334.17903: variable 'interface' from source: set_fact 18699 1726882334.17911: variable 'current_interfaces' from source: set_fact 18699 1726882334.17925: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18699 1726882334.17933: variable 'omit' from source: magic vars 18699 1726882334.17957: variable 'omit' from source: magic vars 18699 1726882334.18001: variable 'item' from source: unknown 18699 1726882334.18075: variable 'item' from source: unknown 18699 1726882334.18096: variable 'omit' from source: magic vars 18699 1726882334.18124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882334.18143: Loading ShellModule 'sh' from 
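(Annotation, not part of the log.) The repeated line `Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True` shows the `when:` expression gating each loop item. A sketch of that evaluation with illustrative values only (the real values come from `set_fact` and include params in the play, which the log does not print):

```python
# illustrative values, not taken from the log; only the expression itself is
type_ = "veth"
state = "present"
interface = "lsr27"
current_interfaces = ["lo", "eth0"]

# the conditional Ansible evaluates per loop item before running the handler
run_task = type_ == "veth" and state == "present" and interface not in current_interfaces
assert run_task  # corresponds to "Evaluated conditional (...): True" in the log
```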
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882334.18198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882334.18201: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882334.18203: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.18205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.18275: Set connection var ansible_connection to ssh 18699 1726882334.18287: Set connection var ansible_pipelining to False 18699 1726882334.18300: Set connection var ansible_shell_executable to /bin/sh 18699 1726882334.18310: Set connection var ansible_timeout to 10 18699 1726882334.18316: Set connection var ansible_shell_type to sh 18699 1726882334.18322: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882334.18347: variable 'ansible_shell_executable' from source: unknown 18699 1726882334.18360: variable 'ansible_connection' from source: unknown 18699 1726882334.18362: variable 'ansible_module_compression' from source: unknown 18699 1726882334.18373: variable 'ansible_shell_type' from source: unknown 18699 1726882334.18375: variable 'ansible_shell_executable' from source: unknown 18699 1726882334.18397: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.18400: variable 'ansible_pipelining' from source: unknown 18699 1726882334.18402: variable 'ansible_timeout' from source: unknown 18699 1726882334.18403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.18484: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882334.18580: variable 'omit' from source: magic vars 18699 1726882334.18591: starting attempt loop 18699 1726882334.18595: running the handler 18699 1726882334.18602: _low_level_execute_command(): starting 18699 1726882334.18605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882334.19210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882334.19254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882334.19344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.19360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.19382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.19400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.19427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.19507: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.21075: stdout chunk (state=3): >>>/root <<< 18699 1726882334.21230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.21234: stdout chunk (state=3): >>><<< 18699 1726882334.21236: stderr chunk (state=3): >>><<< 18699 1726882334.21343: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.21347: _low_level_execute_command(): starting 18699 1726882334.21349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396 `" && echo ansible-tmp-1726882334.2125998-19047-219114847340396="` echo 
/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396 `" ) && sleep 0' 18699 1726882334.22425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.22440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.24306: stdout chunk (state=3): >>>ansible-tmp-1726882334.2125998-19047-219114847340396=/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396 <<< 18699 1726882334.24456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.24464: stdout chunk (state=3): >>><<< 18699 1726882334.24473: stderr chunk (state=3): >>><<< 18699 1726882334.24498: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882334.2125998-19047-219114847340396=/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.24522: variable 'ansible_module_compression' from source: unknown 18699 1726882334.24558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882334.24579: variable 'ansible_facts' from source: unknown 18699 1726882334.24653: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py 18699 1726882334.24920: Sending initial data 18699 1726882334.24936: Sent initial data (156 bytes) 18699 1726882334.26804: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.26850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.26964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.27122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.28659: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882334.28703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882334.28742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmplpt98z4k /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py <<< 18699 1726882334.28751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py" <<< 18699 1726882334.28784: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmplpt98z4k" to remote "/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py" <<< 18699 1726882334.30015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.30025: stdout chunk (state=3): >>><<< 18699 1726882334.30208: stderr chunk (state=3): >>><<< 18699 1726882334.30211: done transferring module to remote 18699 1726882334.30214: _low_level_execute_command(): starting 18699 1726882334.30216: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/ /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py && sleep 0' 18699 1726882334.31301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882334.31323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882334.31326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.31554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.31619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.31673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.33381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.33416: stderr chunk (state=3): >>><<< 18699 1726882334.33431: stdout chunk (state=3): >>><<< 18699 1726882334.33454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.33522: _low_level_execute_command(): starting 18699 1726882334.33538: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/AnsiballZ_command.py && sleep 0' 18699 1726882334.34871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882334.34907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.35005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.35050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.50385: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:32:14.498669", "end": "2024-09-20 21:32:14.502050", "delta": "0:00:00.003381", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882334.51811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882334.51823: stdout chunk (state=3): >>><<< 18699 1726882334.51835: stderr chunk (state=3): >>><<< 18699 1726882334.51857: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 21:32:14.498669", "end": "2024-09-20 21:32:14.502050", "delta": "0:00:00.003381", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882334.51978: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882334.51982: _low_level_execute_command(): starting 18699 1726882334.51984: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882334.2125998-19047-219114847340396/ > /dev/null 2>&1 && sleep 0' 18699 1726882334.52607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882334.52611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.52670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.52695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.52716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.52784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.54587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.54619: stdout chunk (state=3): >>><<< 18699 1726882334.54622: stderr chunk (state=3): >>><<< 18699 1726882334.54638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.54799: handler run complete 18699 1726882334.54803: Evaluated conditional (False): False 18699 1726882334.54805: attempt loop complete, returning result 18699 1726882334.54807: variable 'item' from source: unknown 18699 1726882334.54809: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003381", "end": "2024-09-20 21:32:14.502050", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 21:32:14.498669" } 18699 1726882334.55038: dumping result to json 18699 1726882334.55041: done dumping result, returning 18699 1726882334.55043: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr27 [12673a56-9f93-1ce6-d207-000000000135] 18699 1726882334.55045: sending task result for task 12673a56-9f93-1ce6-d207-000000000135 18699 1726882334.55751: done sending task result for task 12673a56-9f93-1ce6-d207-000000000135 18699 1726882334.55754: WORKER PROCESS EXITING 18699 1726882334.55916: no more pending results, returning what we have 18699 1726882334.55920: results queue empty 18699 1726882334.55921: checking for any_errors_fatal 18699 1726882334.55924: done checking for any_errors_fatal 18699 1726882334.55925: checking for max_fail_percentage 18699 1726882334.55926: done checking for max_fail_percentage 18699 1726882334.55927: checking to see if all hosts have failed and the running 
result is not ok 18699 1726882334.55928: done checking to see if all hosts have failed 18699 1726882334.55929: getting the remaining hosts for this loop 18699 1726882334.55930: done getting the remaining hosts for this loop 18699 1726882334.55933: getting the next task for host managed_node1 18699 1726882334.55937: done getting next task for host managed_node1 18699 1726882334.55940: ^ task is: TASK: Set up veth as managed by NetworkManager 18699 1726882334.55942: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882334.55945: getting variables 18699 1726882334.55946: in VariableManager get_vars() 18699 1726882334.55968: Calling all_inventory to load vars for managed_node1 18699 1726882334.55971: Calling groups_inventory to load vars for managed_node1 18699 1726882334.55974: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882334.55983: Calling all_plugins_play to load vars for managed_node1 18699 1726882334.55986: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882334.55989: Calling groups_plugins_play to load vars for managed_node1 18699 1726882334.56151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882334.56355: done with get_vars() 18699 1726882334.56365: done getting variables 18699 1726882334.56434: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:32:14 -0400 (0:00:01.180) 0:00:08.160 ****** 18699 1726882334.56465: entering _queue_task() for managed_node1/command 18699 1726882334.57020: worker is 1 (out of 1 available) 18699 1726882334.57028: exiting _queue_task() for managed_node1/command 18699 1726882334.57037: done queuing things up, now waiting for results queue to drain 18699 1726882334.57037: waiting for pending results... 
18699 1726882334.57109: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 18699 1726882334.57265: in run() - task 12673a56-9f93-1ce6-d207-000000000136 18699 1726882334.57272: variable 'ansible_search_path' from source: unknown 18699 1726882334.57275: variable 'ansible_search_path' from source: unknown 18699 1726882334.57279: calling self._execute() 18699 1726882334.57363: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.57385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.57402: variable 'omit' from source: magic vars 18699 1726882334.57780: variable 'ansible_distribution_major_version' from source: facts 18699 1726882334.57790: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882334.57908: variable 'type' from source: set_fact 18699 1726882334.57912: variable 'state' from source: include params 18699 1726882334.57922: Evaluated conditional (type == 'veth' and state == 'present'): True 18699 1726882334.57925: variable 'omit' from source: magic vars 18699 1726882334.57950: variable 'omit' from source: magic vars 18699 1726882334.58019: variable 'interface' from source: set_fact 18699 1726882334.58037: variable 'omit' from source: magic vars 18699 1726882334.58069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882334.58100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882334.58117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882334.58129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882334.58141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
18699 1726882334.58165: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882334.58169: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.58171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.58252: Set connection var ansible_connection to ssh 18699 1726882334.58256: Set connection var ansible_pipelining to False 18699 1726882334.58358: Set connection var ansible_shell_executable to /bin/sh 18699 1726882334.58362: Set connection var ansible_timeout to 10 18699 1726882334.58365: Set connection var ansible_shell_type to sh 18699 1726882334.58367: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882334.58369: variable 'ansible_shell_executable' from source: unknown 18699 1726882334.58371: variable 'ansible_connection' from source: unknown 18699 1726882334.58373: variable 'ansible_module_compression' from source: unknown 18699 1726882334.58375: variable 'ansible_shell_type' from source: unknown 18699 1726882334.58377: variable 'ansible_shell_executable' from source: unknown 18699 1726882334.58379: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.58381: variable 'ansible_pipelining' from source: unknown 18699 1726882334.58383: variable 'ansible_timeout' from source: unknown 18699 1726882334.58386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.58563: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882334.58586: variable 'omit' from source: magic vars 18699 1726882334.58604: starting attempt loop 18699 1726882334.58612: running the handler 18699 1726882334.58632: _low_level_execute_command(): 
starting 18699 1726882334.58686: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882334.59506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882334.59526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.59546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882334.59611: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.59679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.59709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.59728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.59832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.61364: stdout chunk (state=3): >>>/root <<< 18699 1726882334.61531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.61535: stdout chunk (state=3): >>><<< 18699 1726882334.61538: stderr chunk (state=3): >>><<< 18699 1726882334.61611: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.61615: _low_level_execute_command(): starting 18699 1726882334.61618: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046 `" && echo ansible-tmp-1726882334.6157012-19110-32782668767046="` echo /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046 `" ) && sleep 0' 18699 1726882334.62220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882334.62318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.62354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.62382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.62455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.64296: stdout chunk (state=3): >>>ansible-tmp-1726882334.6157012-19110-32782668767046=/root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046 <<< 18699 1726882334.64402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.64428: stderr chunk (state=3): >>><<< 18699 1726882334.64431: stdout chunk (state=3): >>><<< 18699 1726882334.64441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882334.6157012-19110-32782668767046=/root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.64491: variable 'ansible_module_compression' from source: unknown 18699 1726882334.64511: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882334.64539: variable 'ansible_facts' from source: unknown 18699 1726882334.64590: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py 18699 1726882334.64687: Sending initial data 18699 1726882334.64691: Sent initial data (155 bytes) 18699 1726882334.65140: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882334.65144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.65214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.65296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.66823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882334.66861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882334.66924: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp0j_auflf /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py <<< 18699 1726882334.66927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py" <<< 18699 1726882334.66952: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp0j_auflf" to remote "/root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py" <<< 18699 1726882334.67476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.67509: stderr chunk (state=3): >>><<< 18699 1726882334.67512: stdout chunk (state=3): >>><<< 18699 1726882334.67528: done transferring module to remote 18699 1726882334.67540: _low_level_execute_command(): starting 18699 1726882334.67543: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/ /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py && sleep 0' 18699 1726882334.67921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882334.67926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.67945: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.67989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.67998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.68039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.69814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.69817: stdout chunk (state=3): >>><<< 18699 1726882334.69819: stderr chunk (state=3): >>><<< 18699 1726882334.69902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.69909: _low_level_execute_command(): starting 18699 1726882334.69912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/AnsiballZ_command.py && sleep 0' 18699 1726882334.70410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882334.70423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882334.70437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882334.70457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882334.70472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882334.70563: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.70580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.70600: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.70618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.70691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.87334: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:32:14.854103", "end": "2024-09-20 21:32:14.872206", "delta": "0:00:00.018103", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882334.88813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882334.88817: stderr chunk (state=3): >>><<< 18699 1726882334.88819: stdout chunk (state=3): >>><<< 18699 1726882334.88955: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 21:32:14.854103", "end": "2024-09-20 21:32:14.872206", "delta": "0:00:00.018103", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882334.88958: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882334.88960: _low_level_execute_command(): starting 18699 1726882334.88962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882334.6157012-19110-32782668767046/ > /dev/null 2>&1 && sleep 0' 18699 1726882334.89611: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882334.89661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882334.89680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882334.89713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882334.89776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882334.91539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882334.91578: stderr chunk (state=3): >>><<< 18699 1726882334.91581: stdout chunk (state=3): >>><<< 18699 1726882334.91603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882334.91611: handler run complete 18699 1726882334.91654: Evaluated conditional (False): False 18699 1726882334.91658: attempt loop complete, returning result 18699 1726882334.91660: _execute() done 18699 1726882334.91662: dumping result to json 18699 1726882334.91664: done dumping result, returning 18699 1726882334.91666: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-1ce6-d207-000000000136] 18699 1726882334.91668: sending task result for task 12673a56-9f93-1ce6-d207-000000000136 ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.018103", "end": "2024-09-20 21:32:14.872206", "rc": 0, "start": "2024-09-20 21:32:14.854103" } 18699 1726882334.92106: no more pending results, returning what we have 18699 1726882334.92109: results queue empty 18699 1726882334.92109: checking for any_errors_fatal 18699 1726882334.92125: done checking for any_errors_fatal 18699 1726882334.92126: checking for max_fail_percentage 18699 1726882334.92128: done checking for max_fail_percentage 18699 1726882334.92129: checking to see if all 
hosts have failed and the running result is not ok 18699 1726882334.92129: done checking to see if all hosts have failed 18699 1726882334.92130: getting the remaining hosts for this loop 18699 1726882334.92131: done getting the remaining hosts for this loop 18699 1726882334.92135: getting the next task for host managed_node1 18699 1726882334.92140: done getting next task for host managed_node1 18699 1726882334.92142: ^ task is: TASK: Delete veth interface {{ interface }} 18699 1726882334.92145: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882334.92150: getting variables 18699 1726882334.92151: in VariableManager get_vars() 18699 1726882334.92177: Calling all_inventory to load vars for managed_node1 18699 1726882334.92179: Calling groups_inventory to load vars for managed_node1 18699 1726882334.92182: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882334.92191: Calling all_plugins_play to load vars for managed_node1 18699 1726882334.92198: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882334.92201: Calling groups_plugins_play to load vars for managed_node1 18699 1726882334.92356: done sending task result for task 12673a56-9f93-1ce6-d207-000000000136 18699 1726882334.92360: WORKER PROCESS EXITING 18699 1726882334.92373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882334.92561: done with get_vars() 18699 1726882334.92571: done getting variables 18699 1726882334.92629: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882334.92739: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:32:14 -0400 (0:00:00.363) 0:00:08.523 ****** 18699 1726882334.92770: entering _queue_task() for managed_node1/command 18699 1726882334.93015: worker is 1 (out of 1 available) 18699 1726882334.93025: exiting _queue_task() for managed_node1/command 18699 1726882334.93039: done queuing things up, now waiting for results queue to drain 18699 1726882334.93040: waiting for pending results... 
18699 1726882334.93294: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 18699 1726882334.93400: in run() - task 12673a56-9f93-1ce6-d207-000000000137 18699 1726882334.93498: variable 'ansible_search_path' from source: unknown 18699 1726882334.93502: variable 'ansible_search_path' from source: unknown 18699 1726882334.93505: calling self._execute() 18699 1726882334.93553: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.93556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.93565: variable 'omit' from source: magic vars 18699 1726882334.93853: variable 'ansible_distribution_major_version' from source: facts 18699 1726882334.93866: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882334.94001: variable 'type' from source: set_fact 18699 1726882334.94006: variable 'state' from source: include params 18699 1726882334.94009: variable 'interface' from source: set_fact 18699 1726882334.94014: variable 'current_interfaces' from source: set_fact 18699 1726882334.94021: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 18699 1726882334.94025: when evaluation is False, skipping this task 18699 1726882334.94027: _execute() done 18699 1726882334.94030: dumping result to json 18699 1726882334.94032: done dumping result, returning 18699 1726882334.94038: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr27 [12673a56-9f93-1ce6-d207-000000000137] 18699 1726882334.94043: sending task result for task 12673a56-9f93-1ce6-d207-000000000137 18699 1726882334.94119: done sending task result for task 12673a56-9f93-1ce6-d207-000000000137 18699 1726882334.94122: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
18699 1726882334.94201: no more pending results, returning what we have 18699 1726882334.94204: results queue empty 18699 1726882334.94205: checking for any_errors_fatal 18699 1726882334.94212: done checking for any_errors_fatal 18699 1726882334.94213: checking for max_fail_percentage 18699 1726882334.94215: done checking for max_fail_percentage 18699 1726882334.94215: checking to see if all hosts have failed and the running result is not ok 18699 1726882334.94216: done checking to see if all hosts have failed 18699 1726882334.94217: getting the remaining hosts for this loop 18699 1726882334.94218: done getting the remaining hosts for this loop 18699 1726882334.94221: getting the next task for host managed_node1 18699 1726882334.94226: done getting next task for host managed_node1 18699 1726882334.94227: ^ task is: TASK: Create dummy interface {{ interface }} 18699 1726882334.94230: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882334.94234: getting variables 18699 1726882334.94236: in VariableManager get_vars() 18699 1726882334.94257: Calling all_inventory to load vars for managed_node1 18699 1726882334.94259: Calling groups_inventory to load vars for managed_node1 18699 1726882334.94261: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882334.94269: Calling all_plugins_play to load vars for managed_node1 18699 1726882334.94271: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882334.94273: Calling groups_plugins_play to load vars for managed_node1 18699 1726882334.94380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882334.94492: done with get_vars() 18699 1726882334.94501: done getting variables 18699 1726882334.94539: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882334.94615: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:32:14 -0400 (0:00:00.018) 0:00:08.542 ****** 18699 1726882334.94634: entering _queue_task() for managed_node1/command 18699 1726882334.94804: worker is 1 (out of 1 available) 18699 1726882334.94816: exiting _queue_task() for managed_node1/command 18699 1726882334.94827: done queuing things up, now waiting for results queue to drain 18699 1726882334.94828: waiting for pending results... 
18699 1726882334.94967: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 18699 1726882334.95024: in run() - task 12673a56-9f93-1ce6-d207-000000000138 18699 1726882334.95036: variable 'ansible_search_path' from source: unknown 18699 1726882334.95040: variable 'ansible_search_path' from source: unknown 18699 1726882334.95067: calling self._execute() 18699 1726882334.95127: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882334.95130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882334.95138: variable 'omit' from source: magic vars 18699 1726882334.95601: variable 'ansible_distribution_major_version' from source: facts 18699 1726882334.95605: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882334.95632: variable 'type' from source: set_fact 18699 1726882334.95643: variable 'state' from source: include params 18699 1726882334.95651: variable 'interface' from source: set_fact 18699 1726882334.95659: variable 'current_interfaces' from source: set_fact 18699 1726882334.95671: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 18699 1726882334.95678: when evaluation is False, skipping this task 18699 1726882334.95684: _execute() done 18699 1726882334.95690: dumping result to json 18699 1726882334.95703: done dumping result, returning 18699 1726882334.95715: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr27 [12673a56-9f93-1ce6-d207-000000000138] 18699 1726882334.95725: sending task result for task 12673a56-9f93-1ce6-d207-000000000138 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 18699 1726882334.95872: no more pending results, returning what we have 18699 1726882334.95876: results queue empty 18699 
1726882334.95877: checking for any_errors_fatal 18699 1726882334.95881: done checking for any_errors_fatal 18699 1726882334.95882: checking for max_fail_percentage 18699 1726882334.95883: done checking for max_fail_percentage 18699 1726882334.95884: checking to see if all hosts have failed and the running result is not ok 18699 1726882334.95885: done checking to see if all hosts have failed 18699 1726882334.95886: getting the remaining hosts for this loop 18699 1726882334.95887: done getting the remaining hosts for this loop 18699 1726882334.95890: getting the next task for host managed_node1 18699 1726882334.95901: done getting next task for host managed_node1 18699 1726882334.95904: ^ task is: TASK: Delete dummy interface {{ interface }} 18699 1726882334.95907: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882334.95912: getting variables 18699 1726882334.95914: in VariableManager get_vars() 18699 1726882334.95942: Calling all_inventory to load vars for managed_node1 18699 1726882334.95945: Calling groups_inventory to load vars for managed_node1 18699 1726882334.95949: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882334.95961: Calling all_plugins_play to load vars for managed_node1 18699 1726882334.95964: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882334.95966: Calling groups_plugins_play to load vars for managed_node1 18699 1726882334.96444: done sending task result for task 12673a56-9f93-1ce6-d207-000000000138 18699 1726882334.96447: WORKER PROCESS EXITING 18699 1726882334.96457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882334.96569: done with get_vars() 18699 1726882334.96579: done getting variables 18699 1726882334.96625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882334.96702: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:32:14 -0400 (0:00:00.020) 0:00:08.563 ****** 18699 1726882334.96723: entering _queue_task() for managed_node1/command 18699 1726882334.96911: worker is 1 (out of 1 available) 18699 1726882334.96924: exiting _queue_task() for managed_node1/command 18699 1726882334.96936: done queuing things up, now waiting for results queue to drain 18699 1726882334.96937: waiting for pending results... 
18699 1726882334.97085: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27
18699 1726882334.97149: in run() - task 12673a56-9f93-1ce6-d207-000000000139
18699 1726882334.97167: variable 'ansible_search_path' from source: unknown
18699 1726882334.97172: variable 'ansible_search_path' from source: unknown
18699 1726882334.97197: calling self._execute()
18699 1726882334.97253: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882334.97256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882334.97266: variable 'omit' from source: magic vars
18699 1726882334.97525: variable 'ansible_distribution_major_version' from source: facts
18699 1726882334.97533: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882334.97660: variable 'type' from source: set_fact
18699 1726882334.97928: variable 'state' from source: include params
18699 1726882334.97931: variable 'interface' from source: set_fact
18699 1726882334.97933: variable 'current_interfaces' from source: set_fact
18699 1726882334.97936: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False
18699 1726882334.97938: when evaluation is False, skipping this task
18699 1726882334.97940: _execute() done
18699 1726882334.97942: dumping result to json
18699 1726882334.97944: done dumping result, returning
18699 1726882334.97946: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr27 [12673a56-9f93-1ce6-d207-000000000139]
18699 1726882334.97947: sending task result for task 12673a56-9f93-1ce6-d207-000000000139
18699 1726882334.98012: done sending task result for task 12673a56-9f93-1ce6-d207-000000000139
18699 1726882334.98016: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18699 1726882334.98066: no more pending results, returning what we have
18699 1726882334.98069: results queue empty
18699 1726882334.98070: checking for any_errors_fatal
18699 1726882334.98074: done checking for any_errors_fatal
18699 1726882334.98075: checking for max_fail_percentage
18699 1726882334.98077: done checking for max_fail_percentage
18699 1726882334.98078: checking to see if all hosts have failed and the running result is not ok
18699 1726882334.98078: done checking to see if all hosts have failed
18699 1726882334.98079: getting the remaining hosts for this loop
18699 1726882334.98080: done getting the remaining hosts for this loop
18699 1726882334.98084: getting the next task for host managed_node1
18699 1726882334.98088: done getting next task for host managed_node1
18699 1726882334.98090: ^ task is: TASK: Create tap interface {{ interface }}
18699 1726882334.98098: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882334.98101: getting variables
18699 1726882334.98103: in VariableManager get_vars()
18699 1726882334.98126: Calling all_inventory to load vars for managed_node1
18699 1726882334.98128: Calling groups_inventory to load vars for managed_node1
18699 1726882334.98131: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882334.98155: Calling all_plugins_play to load vars for managed_node1
18699 1726882334.98158: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882334.98161: Calling groups_plugins_play to load vars for managed_node1
18699 1726882334.98339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882334.98601: done with get_vars()
18699 1726882334.98608: done getting variables
18699 1726882334.98664: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18699 1726882334.98744: variable 'interface' from source: set_fact
TASK [Create tap interface lsr27] **********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60
Friday 20 September 2024 21:32:14 -0400 (0:00:00.020) 0:00:08.583 ******
18699 1726882334.98768: entering _queue_task() for managed_node1/command
18699 1726882334.98947: worker is 1 (out of 1 available)
18699 1726882334.98958: exiting _queue_task() for managed_node1/command
18699 1726882334.98968: done queuing things up, now waiting for results queue to drain
18699 1726882334.98969: waiting for pending results...
18699 1726882334.99123: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27
18699 1726882334.99181: in run() - task 12673a56-9f93-1ce6-d207-00000000013a
18699 1726882334.99200: variable 'ansible_search_path' from source: unknown
18699 1726882334.99204: variable 'ansible_search_path' from source: unknown
18699 1726882334.99230: calling self._execute()
18699 1726882334.99288: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882334.99291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882334.99307: variable 'omit' from source: magic vars
18699 1726882334.99550: variable 'ansible_distribution_major_version' from source: facts
18699 1726882334.99559: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882334.99689: variable 'type' from source: set_fact
18699 1726882334.99694: variable 'state' from source: include params
18699 1726882334.99701: variable 'interface' from source: set_fact
18699 1726882334.99704: variable 'current_interfaces' from source: set_fact
18699 1726882334.99712: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
18699 1726882334.99714: when evaluation is False, skipping this task
18699 1726882334.99717: _execute() done
18699 1726882334.99720: dumping result to json
18699 1726882334.99722: done dumping result, returning
18699 1726882334.99729: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr27 [12673a56-9f93-1ce6-d207-00000000013a]
18699 1726882334.99732: sending task result for task 12673a56-9f93-1ce6-d207-00000000013a
18699 1726882334.99813: done sending task result for task 12673a56-9f93-1ce6-d207-00000000013a
18699 1726882334.99816: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18699 1726882334.99879: no more pending results, returning what we have
18699 1726882334.99882: results queue empty
18699 1726882334.99883: checking for any_errors_fatal
18699 1726882334.99887: done checking for any_errors_fatal
18699 1726882334.99888: checking for max_fail_percentage
18699 1726882334.99889: done checking for max_fail_percentage
18699 1726882334.99889: checking to see if all hosts have failed and the running result is not ok
18699 1726882334.99890: done checking to see if all hosts have failed
18699 1726882334.99891: getting the remaining hosts for this loop
18699 1726882334.99892: done getting the remaining hosts for this loop
18699 1726882334.99900: getting the next task for host managed_node1
18699 1726882334.99905: done getting next task for host managed_node1
18699 1726882334.99907: ^ task is: TASK: Delete tap interface {{ interface }}
18699 1726882334.99909: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882334.99912: getting variables
18699 1726882334.99913: in VariableManager get_vars()
18699 1726882334.99936: Calling all_inventory to load vars for managed_node1
18699 1726882334.99938: Calling groups_inventory to load vars for managed_node1
18699 1726882334.99940: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882334.99946: Calling all_plugins_play to load vars for managed_node1
18699 1726882334.99948: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882334.99949: Calling groups_plugins_play to load vars for managed_node1
18699 1726882335.00085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882335.00198: done with get_vars()
18699 1726882335.00204: done getting variables
18699 1726882335.00243: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
18699 1726882335.00314: variable 'interface' from source: set_fact
TASK [Delete tap interface lsr27] **********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Friday 20 September 2024 21:32:15 -0400 (0:00:00.015) 0:00:08.599 ******
18699 1726882335.00334: entering _queue_task() for managed_node1/command
18699 1726882335.00531: worker is 1 (out of 1 available)
18699 1726882335.00543: exiting _queue_task() for managed_node1/command
18699 1726882335.00554: done queuing things up, now waiting for results queue to drain
18699 1726882335.00555: waiting for pending results...
18699 1726882335.01008: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27
18699 1726882335.01011: in run() - task 12673a56-9f93-1ce6-d207-00000000013b
18699 1726882335.01014: variable 'ansible_search_path' from source: unknown
18699 1726882335.01016: variable 'ansible_search_path' from source: unknown
18699 1726882335.01018: calling self._execute()
18699 1726882335.01020: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882335.01022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882335.01024: variable 'omit' from source: magic vars
18699 1726882335.01373: variable 'ansible_distribution_major_version' from source: facts
18699 1726882335.01387: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882335.01575: variable 'type' from source: set_fact
18699 1726882335.01585: variable 'state' from source: include params
18699 1726882335.01595: variable 'interface' from source: set_fact
18699 1726882335.01603: variable 'current_interfaces' from source: set_fact
18699 1726882335.01614: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
18699 1726882335.01620: when evaluation is False, skipping this task
18699 1726882335.01627: _execute() done
18699 1726882335.01634: dumping result to json
18699 1726882335.01641: done dumping result, returning
18699 1726882335.01650: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr27 [12673a56-9f93-1ce6-d207-00000000013b]
18699 1726882335.01657: sending task result for task 12673a56-9f93-1ce6-d207-00000000013b
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
18699 1726882335.01834: no more pending results, returning what we have
18699 1726882335.01838: results queue empty
18699 1726882335.01839: checking for any_errors_fatal
18699 1726882335.01847: done checking for any_errors_fatal
18699 1726882335.01847: checking for max_fail_percentage
18699 1726882335.01849: done checking for max_fail_percentage
18699 1726882335.01850: checking to see if all hosts have failed and the running result is not ok
18699 1726882335.01851: done checking to see if all hosts have failed
18699 1726882335.01851: getting the remaining hosts for this loop
18699 1726882335.01853: done getting the remaining hosts for this loop
18699 1726882335.01856: getting the next task for host managed_node1
18699 1726882335.01864: done getting next task for host managed_node1
18699 1726882335.01868: ^ task is: TASK: Include the task 'assert_device_present.yml'
18699 1726882335.01870: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882335.01874: getting variables
18699 1726882335.01875: in VariableManager get_vars()
18699 1726882335.01905: Calling all_inventory to load vars for managed_node1
18699 1726882335.01908: Calling groups_inventory to load vars for managed_node1
18699 1726882335.01911: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882335.01923: Calling all_plugins_play to load vars for managed_node1
18699 1726882335.01925: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882335.01928: Calling groups_plugins_play to load vars for managed_node1
18699 1726882335.02247: done sending task result for task 12673a56-9f93-1ce6-d207-00000000013b
18699 1726882335.02251: WORKER PROCESS EXITING
18699 1726882335.02274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882335.02464: done with get_vars()
18699 1726882335.02473: done getting variables
TASK [Include the task 'assert_device_present.yml'] ****************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30
Friday 20 September 2024 21:32:15 -0400 (0:00:00.022) 0:00:08.621 ******
18699 1726882335.02556: entering _queue_task() for managed_node1/include_tasks
18699 1726882335.02758: worker is 1 (out of 1 available)
18699 1726882335.02770: exiting _queue_task() for managed_node1/include_tasks
18699 1726882335.02780: done queuing things up, now waiting for results queue to drain
18699 1726882335.02781: waiting for pending results...
18699 1726882335.03012: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml'
18699 1726882335.03091: in run() - task 12673a56-9f93-1ce6-d207-000000000012
18699 1726882335.03115: variable 'ansible_search_path' from source: unknown
18699 1726882335.03152: calling self._execute()
18699 1726882335.03234: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882335.03245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882335.03258: variable 'omit' from source: magic vars
18699 1726882335.03587: variable 'ansible_distribution_major_version' from source: facts
18699 1726882335.03604: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882335.03615: _execute() done
18699 1726882335.03623: dumping result to json
18699 1726882335.03631: done dumping result, returning
18699 1726882335.03640: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-1ce6-d207-000000000012]
18699 1726882335.03652: sending task result for task 12673a56-9f93-1ce6-d207-000000000012
18699 1726882335.03777: no more pending results, returning what we have
18699 1726882335.03782: in VariableManager get_vars()
18699 1726882335.03813: Calling all_inventory to load vars for managed_node1
18699 1726882335.03817: Calling groups_inventory to load vars for managed_node1
18699 1726882335.03820: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882335.03833: Calling all_plugins_play to load vars for managed_node1
18699 1726882335.03836: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882335.03838: Calling groups_plugins_play to load vars for managed_node1
18699 1726882335.04228: done sending task result for task 12673a56-9f93-1ce6-d207-000000000012
18699 1726882335.04231: WORKER PROCESS EXITING
18699 1726882335.04251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882335.04437: done with get_vars()
18699 1726882335.04444: variable 'ansible_search_path' from source: unknown
18699 1726882335.04455: we have included files to process
18699 1726882335.04456: generating all_blocks data
18699 1726882335.04457: done generating all_blocks data
18699 1726882335.04460: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
18699 1726882335.04462: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
18699 1726882335.04464: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
18699 1726882335.04610: in VariableManager get_vars()
18699 1726882335.04625: done with get_vars()
18699 1726882335.04748: done processing included file
18699 1726882335.04750: iterating over new_blocks loaded from include file
18699 1726882335.04751: in VariableManager get_vars()
18699 1726882335.04761: done with get_vars()
18699 1726882335.04762: filtering new block on tags
18699 1726882335.04778: done filtering new block on tags
18699 1726882335.04780: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1
18699 1726882335.04784: extending task lists for all hosts with included blocks
18699 1726882335.05297: done extending task lists
18699 1726882335.05299: done processing included files
18699 1726882335.05300: results queue empty
18699 1726882335.05300: checking for any_errors_fatal
18699 1726882335.05303: done checking for any_errors_fatal
18699 1726882335.05304: checking for max_fail_percentage
18699 1726882335.05305: done checking for max_fail_percentage
18699 1726882335.05306: checking to see if all hosts have failed and the running result is not ok
18699 1726882335.05306: done checking to see if all hosts have failed
18699 1726882335.05307: getting the remaining hosts for this loop
18699 1726882335.05308: done getting the remaining hosts for this loop
18699 1726882335.05311: getting the next task for host managed_node1
18699 1726882335.05315: done getting next task for host managed_node1
18699 1726882335.05317: ^ task is: TASK: Include the task 'get_interface_stat.yml'
18699 1726882335.05319: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882335.05321: getting variables
18699 1726882335.05322: in VariableManager get_vars()
18699 1726882335.05330: Calling all_inventory to load vars for managed_node1
18699 1726882335.05332: Calling groups_inventory to load vars for managed_node1
18699 1726882335.05334: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882335.05339: Calling all_plugins_play to load vars for managed_node1
18699 1726882335.05341: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882335.05344: Calling groups_plugins_play to load vars for managed_node1
18699 1726882335.05505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882335.05680: done with get_vars()
18699 1726882335.05688: done getting variables
TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 21:32:15 -0400 (0:00:00.031) 0:00:08.653 ******
18699 1726882335.05758: entering _queue_task() for managed_node1/include_tasks
18699 1726882335.06034: worker is 1 (out of 1 available)
18699 1726882335.06046: exiting _queue_task() for managed_node1/include_tasks
18699 1726882335.06060: done queuing things up, now waiting for results queue to drain
18699 1726882335.06061: waiting for pending results...
18699 1726882335.06260: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml'
18699 1726882335.06362: in run() - task 12673a56-9f93-1ce6-d207-0000000001d3
18699 1726882335.06385: variable 'ansible_search_path' from source: unknown
18699 1726882335.06392: variable 'ansible_search_path' from source: unknown
18699 1726882335.06434: calling self._execute()
18699 1726882335.06514: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882335.06523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882335.06534: variable 'omit' from source: magic vars
18699 1726882335.06873: variable 'ansible_distribution_major_version' from source: facts
18699 1726882335.06888: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882335.06901: _execute() done
18699 1726882335.06908: dumping result to json
18699 1726882335.06919: done dumping result, returning
18699 1726882335.06927: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-1ce6-d207-0000000001d3]
18699 1726882335.06937: sending task result for task 12673a56-9f93-1ce6-d207-0000000001d3
18699 1726882335.07054: no more pending results, returning what we have
18699 1726882335.07059: in VariableManager get_vars()
18699 1726882335.07090: Calling all_inventory to load vars for managed_node1
18699 1726882335.07095: Calling groups_inventory to load vars for managed_node1
18699 1726882335.07098: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882335.07110: Calling all_plugins_play to load vars for managed_node1
18699 1726882335.07114: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882335.07116: Calling groups_plugins_play to load vars for managed_node1
18699 1726882335.07463: done sending task result for task 12673a56-9f93-1ce6-d207-0000000001d3
18699 1726882335.07466: WORKER PROCESS EXITING
18699 1726882335.07487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882335.07675: done with get_vars()
18699 1726882335.07682: variable 'ansible_search_path' from source: unknown
18699 1726882335.07683: variable 'ansible_search_path' from source: unknown
18699 1726882335.07717: we have included files to process
18699 1726882335.07718: generating all_blocks data
18699 1726882335.07719: done generating all_blocks data
18699 1726882335.07721: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
18699 1726882335.07722: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
18699 1726882335.07724: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
18699 1726882335.07929: done processing included file
18699 1726882335.07931: iterating over new_blocks loaded from include file
18699 1726882335.07932: in VariableManager get_vars()
18699 1726882335.07943: done with get_vars()
18699 1726882335.07945: filtering new block on tags
18699 1726882335.07959: done filtering new block on tags
18699 1726882335.07961: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1
18699 1726882335.07965: extending task lists for all hosts with included blocks
18699 1726882335.08062: done extending task lists
18699 1726882335.08064: done processing included files
18699 1726882335.08065: results queue empty
18699 1726882335.08065: checking for any_errors_fatal
18699 1726882335.08068: done checking for any_errors_fatal
18699 1726882335.08068: checking for max_fail_percentage
18699 1726882335.08069: done checking for max_fail_percentage
18699 1726882335.08070: checking to see if all hosts have failed and the running result is not ok
18699 1726882335.08071: done checking to see if all hosts have failed
18699 1726882335.08072: getting the remaining hosts for this loop
18699 1726882335.08073: done getting the remaining hosts for this loop
18699 1726882335.08075: getting the next task for host managed_node1
18699 1726882335.08079: done getting next task for host managed_node1
18699 1726882335.08081: ^ task is: TASK: Get stat for interface {{ interface }}
18699 1726882335.08083: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882335.08085: getting variables
18699 1726882335.08086: in VariableManager get_vars()
18699 1726882335.08096: Calling all_inventory to load vars for managed_node1
18699 1726882335.08098: Calling groups_inventory to load vars for managed_node1
18699 1726882335.08101: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882335.08105: Calling all_plugins_play to load vars for managed_node1
18699 1726882335.08107: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882335.08110: Calling groups_plugins_play to load vars for managed_node1
18699 1726882335.08271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882335.08456: done with get_vars()
18699 1726882335.08464: done getting variables
18699 1726882335.08604: variable 'interface' from source: set_fact
TASK [Get stat for interface lsr27] ********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:32:15 -0400 (0:00:00.028) 0:00:08.682 ******
18699 1726882335.08631: entering _queue_task() for managed_node1/stat
18699 1726882335.08845: worker is 1 (out of 1 available)
18699 1726882335.08855: exiting _queue_task() for managed_node1/stat
18699 1726882335.08866: done queuing things up, now waiting for results queue to drain
18699 1726882335.08867: waiting for pending results...
18699 1726882335.09096: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27
18699 1726882335.09197: in run() - task 12673a56-9f93-1ce6-d207-00000000021e
18699 1726882335.09220: variable 'ansible_search_path' from source: unknown
18699 1726882335.09228: variable 'ansible_search_path' from source: unknown
18699 1726882335.09264: calling self._execute()
18699 1726882335.09348: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882335.09359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882335.09371: variable 'omit' from source: magic vars
18699 1726882335.09701: variable 'ansible_distribution_major_version' from source: facts
18699 1726882335.09717: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882335.09728: variable 'omit' from source: magic vars
18699 1726882335.09780: variable 'omit' from source: magic vars
18699 1726882335.09879: variable 'interface' from source: set_fact
18699 1726882335.09903: variable 'omit' from source: magic vars
18699 1726882335.10098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18699 1726882335.10101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18699 1726882335.10104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18699 1726882335.10107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882335.10109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882335.10111: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18699 1726882335.10113: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882335.10115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882335.10185: Set connection var ansible_connection to ssh
18699 1726882335.10199: Set connection var ansible_pipelining to False
18699 1726882335.10209: Set connection var ansible_shell_executable to /bin/sh
18699 1726882335.10216: Set connection var ansible_timeout to 10
18699 1726882335.10221: Set connection var ansible_shell_type to sh
18699 1726882335.10232: Set connection var ansible_module_compression to ZIP_DEFLATED
18699 1726882335.10259: variable 'ansible_shell_executable' from source: unknown
18699 1726882335.10265: variable 'ansible_connection' from source: unknown
18699 1726882335.10270: variable 'ansible_module_compression' from source: unknown
18699 1726882335.10275: variable 'ansible_shell_type' from source: unknown
18699 1726882335.10280: variable 'ansible_shell_executable' from source: unknown
18699 1726882335.10285: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882335.10290: variable 'ansible_pipelining' from source: unknown
18699 1726882335.10299: variable 'ansible_timeout' from source: unknown
18699 1726882335.10306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882335.10503: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
18699 1726882335.10520: variable 'omit' from source: magic vars
18699 1726882335.10531: starting attempt loop
18699 1726882335.10539: running the handler
18699 1726882335.10562: _low_level_execute_command(): starting
18699 1726882335.10575: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18699 1726882335.11400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882335.11419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882335.11445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
18699 1726882335.11473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882335.11562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882335.13218: stdout chunk (state=3): >>>/root <<<
18699 1726882335.13316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882335.13351: stderr chunk (state=3): >>><<<
18699 1726882335.13360: stdout chunk (state=3): >>><<<
18699 1726882335.13386: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18699 1726882335.13411: _low_level_execute_command(): starting
18699 1726882335.13424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382 `" && echo ansible-tmp-1726882335.1339765-19149-138065340583382="` echo /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382 `" ) && sleep 0'
18699 1726882335.14525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882335.14633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882335.14814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882335.14959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882335.16755: stdout chunk (state=3): >>>ansible-tmp-1726882335.1339765-19149-138065340583382=/root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382 <<<
18699 1726882335.16908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882335.16930: stdout chunk (state=3): >>><<<
18699 1726882335.16942: stderr chunk (state=3): >>><<<
18699 1726882335.16972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882335.1339765-19149-138065340583382=/root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18699 1726882335.17033: variable 'ansible_module_compression' from source: unknown
18699 1726882335.17213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
18699 1726882335.17270: variable 'ansible_facts' from source: unknown
18699 1726882335.17426: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py
18699 1726882335.18130: Sending initial data
18699 1726882335.18213: Sent initial data (153 bytes)
18699 1726882335.18954: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882335.18990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at
'/root/.ansible/cp/5685534f65' <<< 18699 1726882335.19062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.19215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.20712: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18699 1726882335.20716: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882335.20778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882335.20815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py" <<< 18699 1726882335.20819: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp67ffbbfn /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py <<< 18699 1726882335.20912: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp67ffbbfn" to remote "/root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py" <<< 18699 1726882335.22025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.22071: stderr chunk (state=3): >>><<< 18699 1726882335.22235: stdout chunk (state=3): >>><<< 18699 1726882335.22238: done transferring module to remote 18699 1726882335.22241: _low_level_execute_command(): starting 18699 1726882335.22243: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/ /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py && sleep 0' 18699 1726882335.23662: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882335.23677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.23789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.25516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.25571: stderr chunk (state=3): >>><<< 18699 1726882335.25612: stdout chunk (state=3): >>><<< 18699 1726882335.25801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882335.25810: _low_level_execute_command(): starting 18699 1726882335.25813: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/AnsiballZ_stat.py && sleep 0' 18699 1726882335.26945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882335.26964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882335.26979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882335.27199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.27226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.27243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882335.27268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.27370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 
1726882335.42504: stdout chunk (state=3): >>> <<< 18699 1726882335.42509: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29436, "dev": 23, "nlink": 1, "atime": 1726882333.7767122, "mtime": 1726882333.7767122, "ctime": 1726882333.7767122, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18699 1726882335.43683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
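The `stat` module's stdout above is a single JSON document; the fields the play inspects later (`stat.exists`, `stat.islnk`) show that `/sys/class/net/lsr27` is a symlink into `/sys/devices/virtual/net/`, i.e. a virtual interface. A minimal sketch of checking those fields, using an abridged copy of the JSON captured in the log:

```python
import json

# Abridged copy of the module stdout captured above, keeping only the
# fields the play later inspects.
raw = ('{"changed": false, "stat": {"exists": true, "islnk": true, '
       '"lnk_source": "/sys/devices/virtual/net/lsr27"}}')

result = json.loads(raw)

# This mirrors the conditional the later assert task evaluates:
#   interface_stat.stat.exists
assert result["stat"]["exists"] is True
# The interface is a virtual device: /sys/class/net/<name> resolves
# into /sys/devices/virtual/net/.
assert result["stat"]["islnk"] is True
print("interface present:", result["stat"]["lnk_source"])
```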
<<< 18699 1726882335.43717: stderr chunk (state=3): >>><<< 18699 1726882335.43720: stdout chunk (state=3): >>><<< 18699 1726882335.43734: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29436, "dev": 23, "nlink": 1, "atime": 1726882333.7767122, "mtime": 1726882333.7767122, "ctime": 1726882333.7767122, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882335.43770: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882335.43779: _low_level_execute_command(): starting 18699 1726882335.43783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882335.1339765-19149-138065340583382/ > /dev/null 2>&1 && sleep 0' 18699 1726882335.44219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882335.44222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.44225: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882335.44227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.44276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.44279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.44327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.46100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.46125: stderr chunk (state=3): >>><<< 18699 1726882335.46133: stdout chunk (state=3): >>><<< 18699 1726882335.46144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882335.46150: handler run complete 18699 1726882335.46187: attempt loop complete, returning result 18699 1726882335.46191: _execute() done 18699 1726882335.46196: dumping result to json 18699 1726882335.46198: done dumping result, returning 18699 1726882335.46206: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [12673a56-9f93-1ce6-d207-00000000021e] 18699 1726882335.46209: sending task result for task 12673a56-9f93-1ce6-d207-00000000021e 18699 1726882335.46315: done sending task result for task 12673a56-9f93-1ce6-d207-00000000021e 18699 1726882335.46318: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882333.7767122, "block_size": 4096, "blocks": 0, "ctime": 1726882333.7767122, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726882333.7767122, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 18699 1726882335.46404: no more pending results, returning what we have 18699 1726882335.46409: results queue empty 18699 1726882335.46409: checking for any_errors_fatal 18699 
1726882335.46411: done checking for any_errors_fatal 18699 1726882335.46411: checking for max_fail_percentage 18699 1726882335.46413: done checking for max_fail_percentage 18699 1726882335.46414: checking to see if all hosts have failed and the running result is not ok 18699 1726882335.46414: done checking to see if all hosts have failed 18699 1726882335.46415: getting the remaining hosts for this loop 18699 1726882335.46416: done getting the remaining hosts for this loop 18699 1726882335.46421: getting the next task for host managed_node1 18699 1726882335.46431: done getting next task for host managed_node1 18699 1726882335.46433: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 18699 1726882335.46436: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882335.46439: getting variables 18699 1726882335.46441: in VariableManager get_vars() 18699 1726882335.46468: Calling all_inventory to load vars for managed_node1 18699 1726882335.46471: Calling groups_inventory to load vars for managed_node1 18699 1726882335.46474: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882335.46484: Calling all_plugins_play to load vars for managed_node1 18699 1726882335.46486: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882335.46489: Calling groups_plugins_play to load vars for managed_node1 18699 1726882335.46628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882335.46767: done with get_vars() 18699 1726882335.46774: done getting variables 18699 1726882335.46851: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18699 1726882335.46936: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:32:15 -0400 (0:00:00.383) 0:00:09.065 ****** 18699 1726882335.46959: entering _queue_task() for managed_node1/assert 18699 1726882335.46960: Creating lock for assert 18699 1726882335.47160: worker is 1 (out of 1 available) 18699 1726882335.47172: exiting _queue_task() for managed_node1/assert 18699 1726882335.47182: done queuing things up, now waiting for results queue to drain 18699 1726882335.47183: waiting for pending results... 
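The stat task above followed Ansible's standard remote execution lifecycle visible in the `_low_level_execute_command()` entries: create a private temp dir, transfer the AnsiballZ payload over SFTP, `chmod u+x` it, run it with the remote Python, then `rm -f -r` the temp dir. A simplified local sketch of that lifecycle (a hypothetical helper, not Ansible's actual code, and without the SSH transport):

```python
import os
import stat
import subprocess
import tempfile

def run_module_payload(payload: str) -> str:
    """Mimic Ansible's temp-dir lifecycle: mkdir, write payload,
    chmod u+x, execute with Python, then clean up."""
    # mkdtemp creates the directory mode 0700, matching the log's
    # `umask 77 && mkdir -p` pattern.
    tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
    script = os.path.join(tmpdir, "AnsiballZ_stat.py")
    try:
        with open(script, "w") as f:
            f.write(payload)
        # Equivalent of the `chmod u+x` step in the log.
        os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)
        result = subprocess.run(
            ["python3", script], capture_output=True, text=True, check=True
        )
        return result.stdout
    finally:
        # Equivalent of the `rm -f -r <tmpdir>` cleanup step.
        subprocess.run(["rm", "-rf", tmpdir], check=False)

print(run_module_payload('print("ok")'))
```

In the real run, each of these steps is a separate SSH invocation multiplexed over the existing control master at `/root/.ansible/cp/5685534f65`, which is why every step's stderr repeats the same ssh_config parsing debug output.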
18699 1726882335.47337: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' 18699 1726882335.47389: in run() - task 12673a56-9f93-1ce6-d207-0000000001d4 18699 1726882335.47406: variable 'ansible_search_path' from source: unknown 18699 1726882335.47411: variable 'ansible_search_path' from source: unknown 18699 1726882335.47437: calling self._execute() 18699 1726882335.47494: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882335.47501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882335.47510: variable 'omit' from source: magic vars 18699 1726882335.47762: variable 'ansible_distribution_major_version' from source: facts 18699 1726882335.47772: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882335.47777: variable 'omit' from source: magic vars 18699 1726882335.47806: variable 'omit' from source: magic vars 18699 1726882335.47874: variable 'interface' from source: set_fact 18699 1726882335.47887: variable 'omit' from source: magic vars 18699 1726882335.47921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882335.47946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882335.47965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882335.47978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882335.47988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882335.48015: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882335.48018: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882335.48021: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882335.48086: Set connection var ansible_connection to ssh 18699 1726882335.48094: Set connection var ansible_pipelining to False 18699 1726882335.48103: Set connection var ansible_shell_executable to /bin/sh 18699 1726882335.48108: Set connection var ansible_timeout to 10 18699 1726882335.48111: Set connection var ansible_shell_type to sh 18699 1726882335.48116: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882335.48136: variable 'ansible_shell_executable' from source: unknown 18699 1726882335.48139: variable 'ansible_connection' from source: unknown 18699 1726882335.48141: variable 'ansible_module_compression' from source: unknown 18699 1726882335.48144: variable 'ansible_shell_type' from source: unknown 18699 1726882335.48146: variable 'ansible_shell_executable' from source: unknown 18699 1726882335.48148: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882335.48152: variable 'ansible_pipelining' from source: unknown 18699 1726882335.48154: variable 'ansible_timeout' from source: unknown 18699 1726882335.48158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882335.48259: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882335.48267: variable 'omit' from source: magic vars 18699 1726882335.48272: starting attempt loop 18699 1726882335.48275: running the handler 18699 1726882335.48367: variable 'interface_stat' from source: set_fact 18699 1726882335.48380: Evaluated conditional (interface_stat.stat.exists): True 18699 1726882335.48385: handler run complete 18699 1726882335.48400: attempt loop complete, returning result 18699 
1726882335.48403: _execute() done 18699 1726882335.48406: dumping result to json 18699 1726882335.48408: done dumping result, returning 18699 1726882335.48415: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr27' [12673a56-9f93-1ce6-d207-0000000001d4] 18699 1726882335.48418: sending task result for task 12673a56-9f93-1ce6-d207-0000000001d4 18699 1726882335.48492: done sending task result for task 12673a56-9f93-1ce6-d207-0000000001d4 18699 1726882335.48498: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18699 1726882335.48560: no more pending results, returning what we have 18699 1726882335.48562: results queue empty 18699 1726882335.48563: checking for any_errors_fatal 18699 1726882335.48571: done checking for any_errors_fatal 18699 1726882335.48572: checking for max_fail_percentage 18699 1726882335.48573: done checking for max_fail_percentage 18699 1726882335.48574: checking to see if all hosts have failed and the running result is not ok 18699 1726882335.48575: done checking to see if all hosts have failed 18699 1726882335.48575: getting the remaining hosts for this loop 18699 1726882335.48577: done getting the remaining hosts for this loop 18699 1726882335.48580: getting the next task for host managed_node1 18699 1726882335.48586: done getting next task for host managed_node1 18699 1726882335.48587: ^ task is: TASK: meta (flush_handlers) 18699 1726882335.48589: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882335.48592: getting variables 18699 1726882335.48595: in VariableManager get_vars() 18699 1726882335.48619: Calling all_inventory to load vars for managed_node1 18699 1726882335.48622: Calling groups_inventory to load vars for managed_node1 18699 1726882335.48624: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882335.48632: Calling all_plugins_play to load vars for managed_node1 18699 1726882335.48635: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882335.48637: Calling groups_plugins_play to load vars for managed_node1 18699 1726882335.48746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882335.48857: done with get_vars() 18699 1726882335.48864: done getting variables 18699 1726882335.48913: in VariableManager get_vars() 18699 1726882335.48919: Calling all_inventory to load vars for managed_node1 18699 1726882335.48921: Calling groups_inventory to load vars for managed_node1 18699 1726882335.48922: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882335.48925: Calling all_plugins_play to load vars for managed_node1 18699 1726882335.48927: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882335.48928: Calling groups_plugins_play to load vars for managed_node1 18699 1726882335.49030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882335.49134: done with get_vars() 18699 1726882335.49143: done queuing things up, now waiting for results queue to drain 18699 1726882335.49144: results queue empty 18699 1726882335.49145: checking for any_errors_fatal 18699 1726882335.49146: done checking for any_errors_fatal 18699 1726882335.49146: checking for max_fail_percentage 18699 1726882335.49147: done checking for max_fail_percentage 18699 1726882335.49148: checking to see if all hosts have failed and the running result is not 
ok 18699 1726882335.49148: done checking to see if all hosts have failed 18699 1726882335.49153: getting the remaining hosts for this loop 18699 1726882335.49153: done getting the remaining hosts for this loop 18699 1726882335.49155: getting the next task for host managed_node1 18699 1726882335.49157: done getting next task for host managed_node1 18699 1726882335.49158: ^ task is: TASK: meta (flush_handlers) 18699 1726882335.49159: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882335.49161: getting variables 18699 1726882335.49161: in VariableManager get_vars() 18699 1726882335.49166: Calling all_inventory to load vars for managed_node1 18699 1726882335.49167: Calling groups_inventory to load vars for managed_node1 18699 1726882335.49169: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882335.49172: Calling all_plugins_play to load vars for managed_node1 18699 1726882335.49173: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882335.49175: Calling groups_plugins_play to load vars for managed_node1 18699 1726882335.49256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882335.49359: done with get_vars() 18699 1726882335.49365: done getting variables 18699 1726882335.49396: in VariableManager get_vars() 18699 1726882335.49402: Calling all_inventory to load vars for managed_node1 18699 1726882335.49403: Calling groups_inventory to load vars for managed_node1 18699 1726882335.49404: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882335.49407: Calling all_plugins_play to load vars for managed_node1 18699 1726882335.49409: Calling groups_plugins_inventory to load vars for 
managed_node1 18699 1726882335.49410: Calling groups_plugins_play to load vars for managed_node1 18699 1726882335.49489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882335.49611: done with get_vars() 18699 1726882335.49619: done queuing things up, now waiting for results queue to drain 18699 1726882335.49620: results queue empty 18699 1726882335.49621: checking for any_errors_fatal 18699 1726882335.49622: done checking for any_errors_fatal 18699 1726882335.49623: checking for max_fail_percentage 18699 1726882335.49623: done checking for max_fail_percentage 18699 1726882335.49624: checking to see if all hosts have failed and the running result is not ok 18699 1726882335.49624: done checking to see if all hosts have failed 18699 1726882335.49625: getting the remaining hosts for this loop 18699 1726882335.49625: done getting the remaining hosts for this loop 18699 1726882335.49627: getting the next task for host managed_node1 18699 1726882335.49628: done getting next task for host managed_node1 18699 1726882335.49629: ^ task is: None 18699 1726882335.49630: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882335.49631: done queuing things up, now waiting for results queue to drain 18699 1726882335.49631: results queue empty 18699 1726882335.49632: checking for any_errors_fatal 18699 1726882335.49632: done checking for any_errors_fatal 18699 1726882335.49633: checking for max_fail_percentage 18699 1726882335.49634: done checking for max_fail_percentage 18699 1726882335.49635: checking to see if all hosts have failed and the running result is not ok 18699 1726882335.49635: done checking to see if all hosts have failed 18699 1726882335.49636: getting the next task for host managed_node1 18699 1726882335.49638: done getting next task for host managed_node1 18699 1726882335.49639: ^ task is: None 18699 1726882335.49640: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882335.49672: in VariableManager get_vars() 18699 1726882335.49688: done with get_vars() 18699 1726882335.49691: in VariableManager get_vars() 18699 1726882335.49702: done with get_vars() 18699 1726882335.49705: variable 'omit' from source: magic vars 18699 1726882335.49725: in VariableManager get_vars() 18699 1726882335.49732: done with get_vars() 18699 1726882335.49746: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18699 1726882335.50113: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882335.50132: getting the remaining hosts for this loop 18699 1726882335.50133: done getting the remaining hosts for this loop 18699 1726882335.50134: getting the next task for host managed_node1 18699 1726882335.50136: done getting next task for host managed_node1 18699 1726882335.50137: ^ task is: TASK: Gathering Facts 18699 1726882335.50138: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882335.50139: getting variables 18699 1726882335.50140: in VariableManager get_vars() 18699 1726882335.50147: Calling all_inventory to load vars for managed_node1 18699 1726882335.50148: Calling groups_inventory to load vars for managed_node1 18699 1726882335.50150: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882335.50153: Calling all_plugins_play to load vars for managed_node1 18699 1726882335.50154: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882335.50156: Calling groups_plugins_play to load vars for managed_node1 18699 1726882335.50239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882335.50367: done with get_vars() 18699 1726882335.50373: done getting variables 18699 1726882335.50402: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 21:32:15 -0400 (0:00:00.034) 0:00:09.100 ****** 18699 1726882335.50418: entering _queue_task() for managed_node1/gather_facts 18699 1726882335.50607: worker is 1 (out of 1 available) 18699 1726882335.50620: exiting _queue_task() for managed_node1/gather_facts 18699 1726882335.50631: done queuing things up, now waiting for results queue to drain 18699 1726882335.50633: waiting for pending results... 
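The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` lines above show the linear strategy handing a task to a worker process and then draining a results queue. A minimal sketch of that queue-and-worker handoff, in plain Python with illustrative names only (this is not Ansible's actual implementation, which uses worker processes and `TaskExecutor().run()`):

```python
# Sketch of the strategy/worker handoff seen in the log: tasks are queued
# per host, workers pull them, and results land on a shared results queue.
import queue
import threading


def worker(task_queue: queue.Queue, results_queue: queue.Queue) -> None:
    """Pull (host, task) pairs until a None sentinel arrives."""
    while True:
        item = task_queue.get()
        if item is None:  # sentinel: no more work for this worker
            break
        host, task = item
        # In Ansible, this is roughly where TaskExecutor().run() would
        # execute the module (e.g. gather_facts) over the connection plugin.
        results_queue.put((host, task, "ok"))


def run_play(tasks, hosts, num_workers=1):
    task_q: queue.Queue = queue.Queue()
    result_q: queue.Queue = queue.Queue()
    threads = [
        threading.Thread(target=worker, args=(task_q, result_q))
        for _ in range(num_workers)
    ]
    for t in threads:
        t.start()
    for task in tasks:  # "entering _queue_task() for <host>/<task>"
        for host in hosts:
            task_q.put((host, task))
    for _ in threads:  # one sentinel per worker so each one exits
        task_q.put(None)
    for t in threads:  # "waiting for pending results..."
        t.join()
    results = []
    while not result_q.empty():  # "results queue empty" once drained
        results.append(result_q.get())
    return results


if __name__ == "__main__":
    print(run_play(["Gathering Facts"], ["managed_node1"]))
```

The sentinel-per-worker shutdown mirrors how the strategy knows the results queue can be drained once all queued work is accounted for.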
18699 1726882335.50778: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882335.50838: in run() - task 12673a56-9f93-1ce6-d207-000000000237 18699 1726882335.50852: variable 'ansible_search_path' from source: unknown 18699 1726882335.50880: calling self._execute() 18699 1726882335.50941: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882335.50945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882335.50953: variable 'omit' from source: magic vars 18699 1726882335.51212: variable 'ansible_distribution_major_version' from source: facts 18699 1726882335.51222: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882335.51228: variable 'omit' from source: magic vars 18699 1726882335.51245: variable 'omit' from source: magic vars 18699 1726882335.51270: variable 'omit' from source: magic vars 18699 1726882335.51306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882335.51331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882335.51347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882335.51360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882335.51369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882335.51391: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882335.51396: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882335.51404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882335.51468: Set connection var ansible_connection to ssh 18699 1726882335.51474: Set 
connection var ansible_pipelining to False 18699 1726882335.51479: Set connection var ansible_shell_executable to /bin/sh 18699 1726882335.51485: Set connection var ansible_timeout to 10 18699 1726882335.51487: Set connection var ansible_shell_type to sh 18699 1726882335.51491: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882335.51520: variable 'ansible_shell_executable' from source: unknown 18699 1726882335.51524: variable 'ansible_connection' from source: unknown 18699 1726882335.51526: variable 'ansible_module_compression' from source: unknown 18699 1726882335.51529: variable 'ansible_shell_type' from source: unknown 18699 1726882335.51531: variable 'ansible_shell_executable' from source: unknown 18699 1726882335.51533: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882335.51535: variable 'ansible_pipelining' from source: unknown 18699 1726882335.51537: variable 'ansible_timeout' from source: unknown 18699 1726882335.51539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882335.51667: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882335.51675: variable 'omit' from source: magic vars 18699 1726882335.51679: starting attempt loop 18699 1726882335.51682: running the handler 18699 1726882335.51696: variable 'ansible_facts' from source: unknown 18699 1726882335.51712: _low_level_execute_command(): starting 18699 1726882335.51719: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882335.52229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18699 1726882335.52234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882335.52237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.52286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.52289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882335.52291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.52338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.53898: stdout chunk (state=3): >>>/root <<< 18699 1726882335.53991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.54020: stderr chunk (state=3): >>><<< 18699 1726882335.54025: stdout chunk (state=3): >>><<< 18699 1726882335.54042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882335.54054: _low_level_execute_command(): starting 18699 1726882335.54061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693 `" && echo ansible-tmp-1726882335.540413-19177-21573167123693="` echo /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693 `" ) && sleep 0' 18699 1726882335.54469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882335.54473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.54489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.54544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.54548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.54599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.56457: stdout chunk (state=3): >>>ansible-tmp-1726882335.540413-19177-21573167123693=/root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693 <<< 18699 1726882335.56564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.56588: stderr chunk (state=3): >>><<< 18699 1726882335.56591: stdout chunk (state=3): >>><<< 18699 1726882335.56609: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882335.540413-19177-21573167123693=/root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882335.56634: variable 'ansible_module_compression' from source: unknown 18699 1726882335.56687: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882335.56737: variable 'ansible_facts' from source: unknown 18699 1726882335.56868: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py 18699 1726882335.56968: Sending initial data 18699 1726882335.56972: Sent initial data (152 bytes) 18699 1726882335.57424: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882335.57428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882335.57430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882335.57432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882335.57434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.57477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.57480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.57527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.59022: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882335.59026: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882335.59062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882335.59108: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpagxzdvgs /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py <<< 18699 1726882335.59111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py" <<< 18699 1726882335.59147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpagxzdvgs" to remote "/root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py" <<< 18699 1726882335.60162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.60334: stderr chunk (state=3): >>><<< 18699 1726882335.60337: stdout chunk (state=3): >>><<< 18699 1726882335.60340: done transferring module to remote 18699 1726882335.60342: _low_level_execute_command(): starting 18699 1726882335.60344: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/ /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py && sleep 0' 18699 1726882335.61086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882335.61158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.61213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.61228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882335.61253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882335.61322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882335.63300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882335.63308: stdout chunk (state=3): >>><<< 18699 1726882335.63310: stderr chunk (state=3): >>><<< 18699 1726882335.63313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882335.63315: _low_level_execute_command(): starting 18699 1726882335.63317: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/AnsiballZ_setup.py && sleep 0' 18699 1726882335.64476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882335.64515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882335.64533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882335.64712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882335.64749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882335.64866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882335.64886: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18699 1726882335.64958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882336.27155: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.45263671875, "5m": 0.31787109375, "15m": 
0.15576171875}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "15", "epoch": "1726882335", "epoch_int": "1726882335", "date": "2024-09-20", "time": "21:32:15", "iso8601_micro": "2024-09-21T01:32:15.922209Z", "iso8601": "2024-09-21T01:32:15Z", "iso8601_basic": "20240920T213215922209", "iso8601_basic_short": "20240920T213215", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_interfaces": ["lsr27", "eth0", "lo", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_s<<< 18699 1726882336.27180: stdout chunk (state=3): >>>egmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64<<< 18699 1726882336.27189: stdout chunk (state=3): >>>", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2933, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 598, "free": 2933}, "nocache": {"free": 3271, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 769, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794738176, "block_size": 4096, "block_total": 65519099, "block_available": 63914731, "block_used": 1604368, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882336.29118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882336.29149: stderr chunk (state=3): >>><<< 18699 1726882336.29152: stdout chunk (state=3): >>><<< 18699 1726882336.29184: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.45263671875, "5m": 0.31787109375, "15m": 
0.15576171875}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "15", "epoch": "1726882335", "epoch_int": "1726882335", "date": "2024-09-20", "time": "21:32:15", "iso8601_micro": "2024-09-21T01:32:15.922209Z", "iso8601": "2024-09-21T01:32:15Z", "iso8601_basic": "20240920T213215922209", "iso8601_basic_short": "20240920T213215", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_interfaces": ["lsr27", "eth0", "lo", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], 
"features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2933, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 598, "free": 2933}, "nocache": {"free": 3271, "used": 260}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 769, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794738176, "block_size": 4096, "block_total": 65519099, "block_available": 63914731, "block_used": 1604368, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed.
18699 1726882336.29663: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
18699 1726882336.29679: _low_level_execute_command(): starting
18699 1726882336.29683: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882335.540413-19177-21573167123693/ > /dev/null 2>&1 && sleep 0'
18699 1726882336.30136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
18699 1726882336.30141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
18699 1726882336.30144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882336.30146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18699 1726882336.30148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882336.30200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882336.30204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882336.30251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882336.32039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882336.32065: stderr chunk (state=3): >>><<<
18699 1726882336.32069: stdout chunk (state=3): >>><<<
18699 1726882336.32081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18699 1726882336.32087: handler run complete
18699 1726882336.32172: variable 'ansible_facts' from source: unknown
18699 1726882336.32241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.32448: variable 'ansible_facts' from source: unknown
18699 1726882336.32508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.32591: attempt loop complete, returning result
18699 1726882336.32596: _execute() done
18699 1726882336.32602: dumping result to json
18699 1726882336.32626: done dumping result, returning
18699 1726882336.32634: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-000000000237]
18699 1726882336.32637: sending task result for task 12673a56-9f93-1ce6-d207-000000000237
18699 1726882336.33165: done sending task result for task 12673a56-9f93-1ce6-d207-000000000237
18699 1726882336.33168: WORKER PROCESS EXITING
ok: [managed_node1]
18699 1726882336.33341: no more pending results, returning what we have
18699 1726882336.33343: results queue empty
18699 1726882336.33344: checking for any_errors_fatal
18699 1726882336.33345: done checking for any_errors_fatal
18699 1726882336.33345: checking for max_fail_percentage
18699 1726882336.33346: done checking for max_fail_percentage
18699 1726882336.33347: checking to see if all hosts have failed and the running result is not ok
18699 1726882336.33347: done checking to see if all hosts have failed
18699 1726882336.33348: getting the remaining hosts for this loop
18699 1726882336.33348: done getting the remaining hosts for this loop
18699 1726882336.33351: getting the next task for host managed_node1
18699 1726882336.33354: done getting next task for host managed_node1
18699 1726882336.33355: ^ task is: TASK: meta (flush_handlers)
18699 1726882336.33356: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882336.33358: getting variables
18699 1726882336.33359: in VariableManager get_vars()
18699 1726882336.33377: Calling all_inventory to load vars for managed_node1
18699 1726882336.33379: Calling groups_inventory to load vars for managed_node1
18699 1726882336.33380: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.33388: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.33389: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.33391: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.33498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.33624: done with get_vars()
18699 1726882336.33632: done getting variables
18699 1726882336.33678: in VariableManager get_vars()
18699 1726882336.33686: Calling all_inventory to load vars for managed_node1
18699 1726882336.33688: Calling groups_inventory to load vars for managed_node1
18699 1726882336.33689: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.33692: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.33698: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.33700: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.33787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.33926: done with get_vars()
18699 1726882336.33934: done queuing things up, now waiting for results queue to drain
18699 1726882336.33936: results queue empty
18699 1726882336.33936: checking for any_errors_fatal
18699 1726882336.33938: done checking for any_errors_fatal
18699 1726882336.33939: checking for max_fail_percentage
18699 1726882336.33939: done checking for max_fail_percentage
18699 1726882336.33943: checking to see if all hosts have failed and the running result is not ok
18699 1726882336.33944: done checking to see if all hosts have failed
18699 1726882336.33944: getting the remaining hosts for this loop
18699 1726882336.33945: done getting the remaining hosts for this loop
18699 1726882336.33947: getting the next task for host managed_node1
18699 1726882336.33949: done getting next task for host managed_node1
18699 1726882336.33951: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
18699 1726882336.33952: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882336.33959: getting variables
18699 1726882336.33959: in VariableManager get_vars()
18699 1726882336.33967: Calling all_inventory to load vars for managed_node1
18699 1726882336.33968: Calling groups_inventory to load vars for managed_node1
18699 1726882336.33969: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.33972: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.33974: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.33977: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.34065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.34188: done with get_vars()
18699 1726882336.34198: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:32:16 -0400 (0:00:00.838) 0:00:09.938 ******
18699 1726882336.34244: entering _queue_task() for managed_node1/include_tasks
18699 1726882336.34452: worker is 1 (out of 1 available)
18699 1726882336.34464: exiting _queue_task() for managed_node1/include_tasks
18699 1726882336.34474: done queuing things up, now waiting for results queue to drain
18699 1726882336.34475: waiting for pending results...
18699 1726882336.34631: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
18699 1726882336.34690: in run() - task 12673a56-9f93-1ce6-d207-000000000019
18699 1726882336.34706: variable 'ansible_search_path' from source: unknown
18699 1726882336.34710: variable 'ansible_search_path' from source: unknown
18699 1726882336.34735: calling self._execute()
18699 1726882336.34805: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882336.34809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882336.34813: variable 'omit' from source: magic vars
18699 1726882336.35072: variable 'ansible_distribution_major_version' from source: facts
18699 1726882336.35082: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882336.35088: _execute() done
18699 1726882336.35092: dumping result to json
18699 1726882336.35098: done dumping result, returning
18699 1726882336.35102: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-1ce6-d207-000000000019]
18699 1726882336.35107: sending task result for task 12673a56-9f93-1ce6-d207-000000000019
18699 1726882336.35189: done sending task result for task 12673a56-9f93-1ce6-d207-000000000019
18699 1726882336.35192: WORKER PROCESS EXITING
18699 1726882336.35230: no more pending results, returning what we have
18699 1726882336.35234: in VariableManager get_vars()
18699 1726882336.35271: Calling all_inventory to load vars for managed_node1
18699 1726882336.35274: Calling groups_inventory to load vars for managed_node1
18699 1726882336.35276: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.35285: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.35289: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.35292: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.35468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.35600: done with get_vars()
18699 1726882336.35606: variable 'ansible_search_path' from source: unknown
18699 1726882336.35607: variable 'ansible_search_path' from source: unknown
18699 1726882336.35627: we have included files to process
18699 1726882336.35627: generating all_blocks data
18699 1726882336.35628: done generating all_blocks data
18699 1726882336.35629: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
18699 1726882336.35630: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
18699 1726882336.35631: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
18699 1726882336.36070: done processing included file
18699 1726882336.36071: iterating over new_blocks loaded from include file
18699 1726882336.36072: in VariableManager get_vars()
18699 1726882336.36084: done with get_vars()
18699 1726882336.36085: filtering new block on tags
18699 1726882336.36101: done filtering new block on tags
18699 1726882336.36103: in VariableManager get_vars()
18699 1726882336.36115: done with get_vars()
18699 1726882336.36116: filtering new block on tags
18699 1726882336.36127: done filtering new block on tags
18699 1726882336.36128: in VariableManager get_vars()
18699 1726882336.36138: done with get_vars()
18699 1726882336.36139: filtering new block on tags
18699 1726882336.36148: done filtering new block on tags
18699 1726882336.36149: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
18699 1726882336.36152: extending task lists for all hosts with included blocks
18699 1726882336.36360: done extending task lists
18699 1726882336.36361: done processing included files
18699 1726882336.36362: results queue empty
18699 1726882336.36362: checking for any_errors_fatal
18699 1726882336.36363: done checking for any_errors_fatal
18699 1726882336.36364: checking for max_fail_percentage
18699 1726882336.36364: done checking for max_fail_percentage
18699 1726882336.36365: checking to see if all hosts have failed and the running result is not ok
18699 1726882336.36365: done checking to see if all hosts have failed
18699 1726882336.36366: getting the remaining hosts for this loop
18699 1726882336.36367: done getting the remaining hosts for this loop
18699 1726882336.36368: getting the next task for host managed_node1
18699 1726882336.36371: done getting next task for host managed_node1
18699 1726882336.36372: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
18699 1726882336.36374: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882336.36380: getting variables
18699 1726882336.36381: in VariableManager get_vars()
18699 1726882336.36390: Calling all_inventory to load vars for managed_node1
18699 1726882336.36392: Calling groups_inventory to load vars for managed_node1
18699 1726882336.36396: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.36400: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.36401: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.36403: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.36511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.36643: done with get_vars()
18699 1726882336.36649: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:32:16 -0400 (0:00:00.024) 0:00:09.963 ******
18699 1726882336.36697: entering _queue_task() for managed_node1/setup
18699 1726882336.36907: worker is 1 (out of 1 available)
18699 1726882336.36920: exiting _queue_task() for managed_node1/setup
18699 1726882336.36930: done queuing things up, now waiting for results queue to drain
18699 1726882336.36931: waiting for pending results...
18699 1726882336.37077: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
18699 1726882336.37149: in run() - task 12673a56-9f93-1ce6-d207-000000000279
18699 1726882336.37163: variable 'ansible_search_path' from source: unknown
18699 1726882336.37166: variable 'ansible_search_path' from source: unknown
18699 1726882336.37198: calling self._execute()
18699 1726882336.37255: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882336.37259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882336.37268: variable 'omit' from source: magic vars
18699 1726882336.37539: variable 'ansible_distribution_major_version' from source: facts
18699 1726882336.37548: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882336.37689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18699 1726882336.39125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18699 1726882336.39167: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18699 1726882336.39196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18699 1726882336.39225: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18699 1726882336.39246: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18699 1726882336.39304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882336.39337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882336.39354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882336.39379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882336.39390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882336.39433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882336.39450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882336.39467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882336.39491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882336.39506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882336.39609: variable '__network_required_facts' from source: role '' defaults
18699 1726882336.39616: variable 'ansible_facts' from source: unknown
18699 1726882336.39682: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
18699 1726882336.39686: when evaluation is False, skipping this task
18699 1726882336.39689: _execute() done
18699 1726882336.39692: dumping result to json
18699 1726882336.39696: done dumping result, returning
18699 1726882336.39704: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-1ce6-d207-000000000279]
18699 1726882336.39706: sending task result for task 12673a56-9f93-1ce6-d207-000000000279
18699 1726882336.39787: done sending task result for task 12673a56-9f93-1ce6-d207-000000000279
18699 1726882336.39789: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
18699 1726882336.39833: no more pending results, returning what we have
18699 1726882336.39837: results queue empty
18699 1726882336.39837: checking for any_errors_fatal
18699 1726882336.39839: done checking for any_errors_fatal
18699 1726882336.39839: checking for max_fail_percentage
18699 1726882336.39841: done checking for max_fail_percentage
18699 1726882336.39842: checking to see if all hosts have failed and the running result is not ok
18699 1726882336.39842: done checking to see if all hosts have failed
18699 1726882336.39843: getting the remaining hosts for this loop
18699 1726882336.39844: done getting the remaining hosts for this loop
18699 1726882336.39848: getting the next task for host managed_node1
18699 1726882336.39856: done getting next task for host managed_node1
18699 1726882336.39860: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
18699 1726882336.39862: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882336.39874: getting variables
18699 1726882336.39876: in VariableManager get_vars()
18699 1726882336.39914: Calling all_inventory to load vars for managed_node1
18699 1726882336.39917: Calling groups_inventory to load vars for managed_node1
18699 1726882336.39919: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.39929: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.39931: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.39933: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.40086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.40252: done with get_vars()
18699 1726882336.40260: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:32:16 -0400 (0:00:00.036) 0:00:09.999 ******
18699 1726882336.40325: entering _queue_task() for managed_node1/stat
18699 1726882336.40520: worker is 1 (out of 1 available)
18699 1726882336.40535: exiting _queue_task() for managed_node1/stat
18699 1726882336.40544: done queuing things up, now waiting for results queue to drain
18699 1726882336.40545: waiting for pending results...
18699 1726882336.40694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
18699 1726882336.40776: in run() - task 12673a56-9f93-1ce6-d207-00000000027b
18699 1726882336.40786: variable 'ansible_search_path' from source: unknown
18699 1726882336.40790: variable 'ansible_search_path' from source: unknown
18699 1726882336.40820: calling self._execute()
18699 1726882336.40884: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882336.40887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882336.40899: variable 'omit' from source: magic vars
18699 1726882336.41152: variable 'ansible_distribution_major_version' from source: facts
18699 1726882336.41161: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882336.41267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18699 1726882336.41454: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18699 1726882336.41486: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18699 1726882336.41513: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18699 1726882336.41540: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18699 1726882336.41600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18699 1726882336.41616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18699 1726882336.41635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882336.41655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18699 1726882336.41717: variable '__network_is_ostree' from source: set_fact
18699 1726882336.41723: Evaluated conditional (not __network_is_ostree is defined): False
18699 1726882336.41725: when evaluation is False, skipping this task
18699 1726882336.41728: _execute() done
18699 1726882336.41730: dumping result to json
18699 1726882336.41734: done dumping result, returning
18699 1726882336.41740: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-1ce6-d207-00000000027b]
18699 1726882336.41743: sending task result for task 12673a56-9f93-1ce6-d207-00000000027b
18699 1726882336.41825: done sending task result for task 12673a56-9f93-1ce6-d207-00000000027b
18699 1726882336.41827: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
18699 1726882336.41904: no more pending results, returning what we have
18699 1726882336.41907: results queue empty
18699 1726882336.41908: checking for any_errors_fatal
18699 1726882336.41914: done checking for any_errors_fatal
18699 1726882336.41914: checking for max_fail_percentage
18699 1726882336.41916: done checking for max_fail_percentage
18699 1726882336.41917: checking to see if all hosts have failed and the running result is not ok
18699 1726882336.41918: done checking to see if all hosts have failed
18699 1726882336.41918: getting the remaining hosts for this loop
18699 1726882336.41919: done getting the remaining hosts for this loop
18699 1726882336.41922: getting the next task for host managed_node1
18699 1726882336.41927: done getting next task for host managed_node1
18699 1726882336.41930: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
18699 1726882336.41932: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882336.41943: getting variables
18699 1726882336.41945: in VariableManager get_vars()
18699 1726882336.41973: Calling all_inventory to load vars for managed_node1
18699 1726882336.41975: Calling groups_inventory to load vars for managed_node1
18699 1726882336.41978: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.41984: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.41986: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.41988: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.42112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.42248: done with get_vars()
18699 1726882336.42255: done getting variables
18699 1726882336.42291: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:32:16 -0400 (0:00:00.019) 0:00:10.019 ******
18699 1726882336.42319: entering _queue_task() for managed_node1/set_fact
18699 1726882336.42505: worker is 1 (out of 1 available)
18699 1726882336.42518: exiting _queue_task() for managed_node1/set_fact
18699 1726882336.42530: done queuing things up, now waiting for results queue to drain
18699 1726882336.42531: waiting for pending results...
18699 1726882336.42671: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
18699 1726882336.42737: in run() - task 12673a56-9f93-1ce6-d207-00000000027c
18699 1726882336.42748: variable 'ansible_search_path' from source: unknown
18699 1726882336.42754: variable 'ansible_search_path' from source: unknown
18699 1726882336.42778: calling self._execute()
18699 1726882336.42842: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882336.42845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882336.42855: variable 'omit' from source: magic vars
18699 1726882336.43107: variable 'ansible_distribution_major_version' from source: facts
18699 1726882336.43116: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882336.43223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18699 1726882336.43467: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18699 1726882336.43500: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18699 1726882336.43525: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18699 1726882336.43549: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18699 1726882336.43610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18699 1726882336.43629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18699 1726882336.43649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882336.43666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18699 1726882336.43727: variable '__network_is_ostree' from source: set_fact
18699 1726882336.43733: Evaluated conditional (not __network_is_ostree is defined): False
18699 1726882336.43736: when evaluation is False, skipping this task
18699 1726882336.43739: _execute() done
18699 1726882336.43741: dumping result to json
18699 1726882336.43746: done dumping result, returning
18699 1726882336.43754: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-1ce6-d207-00000000027c]
18699 1726882336.43757: sending task result for task 12673a56-9f93-1ce6-d207-00000000027c
18699 1726882336.43837: done sending task result for task 12673a56-9f93-1ce6-d207-00000000027c
18699 1726882336.43840: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
18699 1726882336.43901: no more pending results, returning what we have
18699 1726882336.43904: results queue empty
18699 1726882336.43905: checking for any_errors_fatal
18699 1726882336.43909: done checking for any_errors_fatal
18699 1726882336.43910: checking for max_fail_percentage
18699 1726882336.43911: done checking for max_fail_percentage
18699 1726882336.43912: checking to see if all hosts have failed and the running result is not ok
18699 1726882336.43913: done checking to see if all hosts have failed
18699 1726882336.43914: getting the remaining hosts for this loop
18699 1726882336.43915: done getting the remaining hosts for this loop
18699 1726882336.43918: getting the next task for host managed_node1
18699 1726882336.43925: done getting next task for host managed_node1
18699 1726882336.43929: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
18699 1726882336.43931: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882336.43943: getting variables
18699 1726882336.43945: in VariableManager get_vars()
18699 1726882336.43976: Calling all_inventory to load vars for managed_node1
18699 1726882336.43979: Calling groups_inventory to load vars for managed_node1
18699 1726882336.43981: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882336.43988: Calling all_plugins_play to load vars for managed_node1
18699 1726882336.43991: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882336.43997: Calling groups_plugins_play to load vars for managed_node1
18699 1726882336.44158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882336.44297: done with get_vars()
18699 1726882336.44305: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:32:16 -0400 (0:00:00.020) 0:00:10.039 ******
18699 1726882336.44365: entering _queue_task() for managed_node1/service_facts
18699 1726882336.44366: Creating lock for service_facts
18699 1726882336.44562: worker is 1 (out of 1 available)
18699 1726882336.44576: exiting _queue_task() for managed_node1/service_facts
18699 1726882336.44586: done queuing things up, now waiting for results queue to drain
18699 1726882336.44587: waiting for pending results...
18699 1726882336.44738: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18699 1726882336.44799: in run() - task 12673a56-9f93-1ce6-d207-00000000027e 18699 1726882336.44811: variable 'ansible_search_path' from source: unknown 18699 1726882336.44817: variable 'ansible_search_path' from source: unknown 18699 1726882336.44841: calling self._execute() 18699 1726882336.44902: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882336.44906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882336.44916: variable 'omit' from source: magic vars 18699 1726882336.45169: variable 'ansible_distribution_major_version' from source: facts 18699 1726882336.45178: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882336.45183: variable 'omit' from source: magic vars 18699 1726882336.45223: variable 'omit' from source: magic vars 18699 1726882336.45248: variable 'omit' from source: magic vars 18699 1726882336.45279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882336.45307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882336.45323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882336.45336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882336.45345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882336.45369: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882336.45372: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882336.45375: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18699 1726882336.45440: Set connection var ansible_connection to ssh 18699 1726882336.45447: Set connection var ansible_pipelining to False 18699 1726882336.45452: Set connection var ansible_shell_executable to /bin/sh 18699 1726882336.45457: Set connection var ansible_timeout to 10 18699 1726882336.45461: Set connection var ansible_shell_type to sh 18699 1726882336.45464: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882336.45487: variable 'ansible_shell_executable' from source: unknown 18699 1726882336.45490: variable 'ansible_connection' from source: unknown 18699 1726882336.45497: variable 'ansible_module_compression' from source: unknown 18699 1726882336.45500: variable 'ansible_shell_type' from source: unknown 18699 1726882336.45502: variable 'ansible_shell_executable' from source: unknown 18699 1726882336.45504: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882336.45506: variable 'ansible_pipelining' from source: unknown 18699 1726882336.45509: variable 'ansible_timeout' from source: unknown 18699 1726882336.45511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882336.45647: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882336.45654: variable 'omit' from source: magic vars 18699 1726882336.45658: starting attempt loop 18699 1726882336.45660: running the handler 18699 1726882336.45672: _low_level_execute_command(): starting 18699 1726882336.45679: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882336.46189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18699 1726882336.46192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.46200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882336.46203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.46257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882336.46265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882336.46267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882336.46310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882336.47955: stdout chunk (state=3): >>>/root <<< 18699 1726882336.48056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882336.48086: stderr chunk (state=3): >>><<< 18699 1726882336.48090: stdout chunk (state=3): >>><<< 18699 1726882336.48110: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882336.48121: _low_level_execute_command(): starting 18699 1726882336.48126: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806 `" && echo ansible-tmp-1726882336.4811006-19228-224505638274806="` echo /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806 `" ) && sleep 0' 18699 1726882336.48558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882336.48561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882336.48563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.48573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882336.48575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.48622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882336.48625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882336.48671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882336.50543: stdout chunk (state=3): >>>ansible-tmp-1726882336.4811006-19228-224505638274806=/root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806 <<< 18699 1726882336.50647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882336.50672: stderr chunk (state=3): >>><<< 18699 1726882336.50675: stdout chunk (state=3): >>><<< 18699 1726882336.50689: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882336.4811006-19228-224505638274806=/root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882336.50731: variable 'ansible_module_compression' from source: unknown 18699 1726882336.50765: ANSIBALLZ: Using lock for service_facts 18699 1726882336.50769: ANSIBALLZ: Acquiring lock 18699 1726882336.50773: ANSIBALLZ: Lock acquired: 140254442388464 18699 1726882336.50775: ANSIBALLZ: Creating module 18699 1726882336.59406: ANSIBALLZ: Writing module into payload 18699 1726882336.59410: ANSIBALLZ: Writing module 18699 1726882336.59413: ANSIBALLZ: Renaming module 18699 1726882336.59415: ANSIBALLZ: Done creating module 18699 1726882336.59416: variable 'ansible_facts' from source: unknown 18699 1726882336.59418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py 18699 1726882336.59532: Sending initial data 18699 1726882336.59548: Sent initial data (162 bytes) 18699 1726882336.59985: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882336.60005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882336.60016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.60054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882336.60066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882336.60117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882336.61678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882336.61715: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882336.61781: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpd54ttx18 /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py <<< 18699 1726882336.61790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py" <<< 18699 1726882336.61814: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18699 1726882336.61842: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpd54ttx18" to remote "/root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py" <<< 18699 1726882336.62703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882336.62732: stderr chunk (state=3): >>><<< 18699 1726882336.62740: stdout chunk (state=3): >>><<< 18699 1726882336.62979: done transferring module to remote 18699 1726882336.62982: _low_level_execute_command(): starting 18699 1726882336.62984: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/ /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py && sleep 0' 18699 1726882336.64312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.64409: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882336.64436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882336.64508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882336.66496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882336.66500: stdout chunk (state=3): >>><<< 18699 1726882336.66503: stderr chunk (state=3): >>><<< 18699 1726882336.66531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882336.66534: _low_level_execute_command(): starting 18699 1726882336.66537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/AnsiballZ_service_facts.py && sleep 0' 18699 1726882336.67179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882336.67199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882336.67213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882336.67233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882336.67306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882336.67342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882336.67371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 
1726882336.67430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882336.67508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882338.18767: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"<<< 18699 1726882338.18824: stdout chunk (state=3): >>>}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": 
{"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18699 1726882338.20803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882338.20806: stdout chunk (state=3): >>><<< 18699 1726882338.20809: stderr chunk (state=3): >>><<< 18699 1726882338.20813: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": 
"systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 18699 1726882338.22489: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882338.22576: _low_level_execute_command(): starting 18699 1726882338.22585: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882336.4811006-19228-224505638274806/ > /dev/null 2>&1 && sleep 0' 18699 1726882338.24110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882338.24221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.24332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882338.24446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882338.24460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882338.24531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882338.26395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882338.26399: stdout chunk (state=3): >>><<< 18699 1726882338.26402: stderr chunk (state=3): >>><<< 18699 1726882338.26600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882338.26604: handler run complete 18699 
1726882338.26973: variable 'ansible_facts' from source: unknown 18699 1726882338.30001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882338.30967: variable 'ansible_facts' from source: unknown 18699 1726882338.31395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882338.32019: attempt loop complete, returning result 18699 1726882338.32079: _execute() done 18699 1726882338.32091: dumping result to json 18699 1726882338.32243: done dumping result, returning 18699 1726882338.32246: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-1ce6-d207-00000000027e] 18699 1726882338.32386: sending task result for task 12673a56-9f93-1ce6-d207-00000000027e ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882338.34734: no more pending results, returning what we have 18699 1726882338.34738: results queue empty 18699 1726882338.34738: checking for any_errors_fatal 18699 1726882338.34743: done checking for any_errors_fatal 18699 1726882338.34744: checking for max_fail_percentage 18699 1726882338.34745: done checking for max_fail_percentage 18699 1726882338.34748: checking to see if all hosts have failed and the running result is not ok 18699 1726882338.34748: done checking to see if all hosts have failed 18699 1726882338.34749: getting the remaining hosts for this loop 18699 1726882338.34751: done getting the remaining hosts for this loop 18699 1726882338.34755: getting the next task for host managed_node1 18699 1726882338.34760: done getting next task for host managed_node1 18699 1726882338.34764: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18699 1726882338.34767: ^ state is: HOST STATE: block=2, task=4, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882338.34778: getting variables 18699 1726882338.34779: in VariableManager get_vars() 18699 1726882338.34821: Calling all_inventory to load vars for managed_node1 18699 1726882338.34825: Calling groups_inventory to load vars for managed_node1 18699 1726882338.34827: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882338.34853: done sending task result for task 12673a56-9f93-1ce6-d207-00000000027e 18699 1726882338.34856: WORKER PROCESS EXITING 18699 1726882338.35070: Calling all_plugins_play to load vars for managed_node1 18699 1726882338.35075: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882338.35079: Calling groups_plugins_play to load vars for managed_node1 18699 1726882338.35691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882338.36384: done with get_vars() 18699 1726882338.36400: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:32:18 -0400 (0:00:01.921) 0:00:11.961 ****** 18699 1726882338.36506: entering _queue_task() for managed_node1/package_facts 18699 1726882338.36514: Creating lock for package_facts 18699 1726882338.36964: worker is 1 (out of 1 available) 18699 1726882338.36978: exiting _queue_task() 
for managed_node1/package_facts 18699 1726882338.36991: done queuing things up, now waiting for results queue to drain 18699 1726882338.36992: waiting for pending results... 18699 1726882338.37410: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18699 1726882338.37432: in run() - task 12673a56-9f93-1ce6-d207-00000000027f 18699 1726882338.37452: variable 'ansible_search_path' from source: unknown 18699 1726882338.37461: variable 'ansible_search_path' from source: unknown 18699 1726882338.37507: calling self._execute() 18699 1726882338.37617: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882338.37623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882338.37628: variable 'omit' from source: magic vars 18699 1726882338.37995: variable 'ansible_distribution_major_version' from source: facts 18699 1726882338.38052: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882338.38056: variable 'omit' from source: magic vars 18699 1726882338.38160: variable 'omit' from source: magic vars 18699 1726882338.38163: variable 'omit' from source: magic vars 18699 1726882338.38169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882338.38209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882338.38235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882338.38257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882338.38281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882338.38319: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 18699 1726882338.38329: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882338.38337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882338.38487: Set connection var ansible_connection to ssh 18699 1726882338.38492: Set connection var ansible_pipelining to False 18699 1726882338.38496: Set connection var ansible_shell_executable to /bin/sh 18699 1726882338.38500: Set connection var ansible_timeout to 10 18699 1726882338.38503: Set connection var ansible_shell_type to sh 18699 1726882338.38505: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882338.38572: variable 'ansible_shell_executable' from source: unknown 18699 1726882338.38575: variable 'ansible_connection' from source: unknown 18699 1726882338.38578: variable 'ansible_module_compression' from source: unknown 18699 1726882338.38580: variable 'ansible_shell_type' from source: unknown 18699 1726882338.38582: variable 'ansible_shell_executable' from source: unknown 18699 1726882338.38583: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882338.38585: variable 'ansible_pipelining' from source: unknown 18699 1726882338.38587: variable 'ansible_timeout' from source: unknown 18699 1726882338.38595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882338.38846: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882338.38862: variable 'omit' from source: magic vars 18699 1726882338.38870: starting attempt loop 18699 1726882338.38916: running the handler 18699 1726882338.38920: _low_level_execute_command(): starting 18699 1726882338.38922: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 
1726882338.39690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.39735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882338.39756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882338.39781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882338.39843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882338.41445: stdout chunk (state=3): >>>/root <<< 18699 1726882338.41686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882338.41690: stdout chunk (state=3): >>><<< 18699 1726882338.41692: stderr chunk (state=3): >>><<< 18699 1726882338.41727: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882338.41782: _low_level_execute_command(): starting 18699 1726882338.41786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505 `" && echo ansible-tmp-1726882338.4173534-19303-59443540386505="` echo /root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505 `" ) && sleep 0' 18699 1726882338.43011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882338.43014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882338.43017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882338.43020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882338.43023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882338.43085: stderr chunk (state=3): >>>debug2: match 
not found <<< 18699 1726882338.43099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.43102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882338.43105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882338.43107: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882338.43109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882338.43112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882338.43114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882338.43116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882338.43118: stderr chunk (state=3): >>>debug2: match found <<< 18699 1726882338.43505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882338.43508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882338.43567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882338.45461: stdout chunk (state=3): >>>ansible-tmp-1726882338.4173534-19303-59443540386505=/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505 <<< 18699 1726882338.45549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882338.45560: stdout chunk (state=3): >>><<< 18699 1726882338.45571: stderr chunk (state=3): >>><<< 18699 1726882338.45591: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882338.4173534-19303-59443540386505=/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882338.46001: variable 'ansible_module_compression' from source: unknown 18699 1726882338.46004: ANSIBALLZ: Using lock for package_facts 18699 1726882338.46007: ANSIBALLZ: Acquiring lock 18699 1726882338.46009: ANSIBALLZ: Lock acquired: 140254443147904 18699 1726882338.46011: ANSIBALLZ: Creating module 18699 1726882338.91432: ANSIBALLZ: Writing module into payload 18699 1726882338.91570: ANSIBALLZ: Writing module 18699 1726882338.91601: ANSIBALLZ: Renaming module 18699 1726882338.91611: ANSIBALLZ: Done creating module 18699 1726882338.91648: variable 'ansible_facts' from source: unknown 18699 1726882338.91926: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py 18699 1726882338.92128: Sending initial data 18699 1726882338.92131: Sent initial data (161 bytes) 18699 1726882338.92809: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.92873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882338.92894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882338.92911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882338.93003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882338.94597: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882338.94601: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882338.94632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882338.94670: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpbw8arx8t /root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py <<< 18699 1726882338.94678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py" <<< 18699 1726882338.94712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpbw8arx8t" to remote "/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py" <<< 18699 1726882338.94722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py" <<< 18699 1726882338.95754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882338.95826: stderr chunk (state=3): >>><<< 18699 1726882338.95830: stdout chunk (state=3): >>><<< 18699 1726882338.95832: done transferring module to remote 18699 1726882338.95837: _low_level_execute_command(): starting 18699 1726882338.95876: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/ 
/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py && sleep 0' 18699 1726882338.96501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882338.96504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882338.96510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882338.96512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882338.96522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882338.96524: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882338.96526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.96671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.96674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882338.96677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882338.96691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882338.96770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882338.98528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 
1726882338.98531: stdout chunk (state=3): >>><<< 18699 1726882338.98536: stderr chunk (state=3): >>><<< 18699 1726882338.98548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882338.98550: _low_level_execute_command(): starting 18699 1726882338.98555: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/AnsiballZ_package_facts.py && sleep 0' 18699 1726882338.99138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882338.99142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882338.99181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882338.99249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882339.43044: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": 
"20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": 
"2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": 
"4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": 
"3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}],
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", 
"version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", 
"source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 18699 1726882339.43209: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18699 1726882339.44901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
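The module result above follows the `package_facts` output schema: `ansible_facts.packages` is a mapping from package name to a *list* of entries (`name`, `version`, `release`, `epoch`, `arch`, `source`), since several versions or architectures of one package can be installed side by side. A minimal sketch of consuming such a result outside Ansible — the `raw` payload and the `nevra` helper are illustrative assumptions, not part of the module:

```python
# Sketch: parse a package_facts-style result and render one entry
# as an RPM NEVRA-style string. The tiny inline payload stands in
# for the full JSON stream logged above.
import json

raw = json.dumps({
    "ansible_facts": {
        "packages": {
            # One entry copied from the log; epoch null means "no epoch".
            "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
                     "epoch": None, "arch": "x86_64", "source": "rpm"}],
        }
    }
})

def nevra(pkg):
    """Render one package entry as name-[epoch:]version-release.arch."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

facts = json.loads(raw)
packages = facts["ansible_facts"]["packages"]
print(nevra(packages["git"][0]))  # -> git-2.45.2-3.el10.x86_64
```

Iterating `packages.items()` and flattening each value list is the usual way to walk the whole inventory, e.g. when diffing package sets between the managed nodes.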
<<< 18699 1726882339.44920: stdout chunk (state=3): >>><<< 18699 1726882339.44986: stderr chunk (state=3): >>><<< 18699 1726882339.45103: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
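The `package_facts` result that ends here registers the installed-package inventory under `ansible_facts.packages`: a dict keyed by package name, where each value is a list of dicts with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields (as documented for the module and visible in the output above). A minimal sketch of consuming that structure — the sample data is a hypothetical excerpt mirroring two entries from the log, and the helper names are illustrative, not part of any Ansible API:

```python
# Illustrative sketch: reading the ansible_facts.packages structure that
# package_facts produces. Sample data is a hypothetical excerpt mirroring
# entries in the log above; field names match the module's output format.
packages = {
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
}

def pkg_version(packages, name):
    """Return the version string of the first matching package, or None."""
    entries = packages.get(name)
    return entries[0]["version"] if entries else None

def evr(pkg):
    """Format an epoch:version-release string the way rpm conventionally does."""
    epoch = pkg.get("epoch")
    prefix = f"{epoch}:" if epoch else ""
    return f'{prefix}{pkg["version"]}-{pkg["release"]}'

print(pkg_version(packages, "git"))   # -> 2.45.2
print(evr(packages["openssl"][0]))    # -> 1:3.2.2-12.el10
```

Note that `epoch` can be an integer, `0`, or `null`/`None` depending on the package, so any epoch-aware comparison has to treat the missing case explicitly.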
18699 1726882339.47034: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882339.47066: _low_level_execute_command(): starting 18699 1726882339.47077: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882338.4173534-19303-59443540386505/ > /dev/null 2>&1 && sleep 0' 18699 1726882339.47811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882339.47855: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882339.47882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882339.47910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882339.48011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882339.49782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882339.49815: stderr chunk (state=3): >>><<< 18699 1726882339.49824: stdout chunk (state=3): >>><<< 18699 1726882339.49842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882339.49847: handler run complete 18699 1726882339.56158: variable 'ansible_facts' from source: unknown 18699 1726882339.56449: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.57547: variable 'ansible_facts' from source: unknown 18699 1726882339.57947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.58672: attempt loop complete, returning result 18699 1726882339.58686: _execute() done 18699 1726882339.58689: dumping result to json 18699 1726882339.58891: done dumping result, returning 18699 1726882339.58899: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-1ce6-d207-00000000027f] 18699 1726882339.58902: sending task result for task 12673a56-9f93-1ce6-d207-00000000027f 18699 1726882339.60866: done sending task result for task 12673a56-9f93-1ce6-d207-00000000027f 18699 1726882339.60870: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882339.60914: no more pending results, returning what we have 18699 1726882339.60916: results queue empty 18699 1726882339.60917: checking for any_errors_fatal 18699 1726882339.60919: done checking for any_errors_fatal 18699 1726882339.60919: checking for max_fail_percentage 18699 1726882339.60920: done checking for max_fail_percentage 18699 1726882339.60921: checking to see if all hosts have failed and the running result is not ok 18699 1726882339.60921: done checking to see if all hosts have failed 18699 1726882339.60922: getting the remaining hosts for this loop 18699 1726882339.60923: done getting the remaining hosts for this loop 18699 1726882339.60925: getting the next task for host managed_node1 18699 1726882339.60929: done getting next task for host managed_node1 18699 1726882339.60932: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18699 1726882339.60933: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882339.60939: getting variables 18699 1726882339.60940: in VariableManager get_vars() 18699 1726882339.60960: Calling all_inventory to load vars for managed_node1 18699 1726882339.60961: Calling groups_inventory to load vars for managed_node1 18699 1726882339.60963: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882339.60969: Calling all_plugins_play to load vars for managed_node1 18699 1726882339.60971: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882339.60972: Calling groups_plugins_play to load vars for managed_node1 18699 1726882339.61864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.63549: done with get_vars() 18699 1726882339.63572: done getting variables 18699 1726882339.63626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:32:19 -0400 (0:00:01.271) 0:00:13.232 ****** 18699 1726882339.63659: entering _queue_task() for managed_node1/debug 18699 1726882339.63951: worker is 1 (out of 1 available) 18699 1726882339.63964: exiting _queue_task() for managed_node1/debug 18699 1726882339.63978: done queuing things up, now waiting for results queue to drain 18699 
1726882339.63980: waiting for pending results... 18699 1726882339.64197: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18699 1726882339.64271: in run() - task 12673a56-9f93-1ce6-d207-00000000001a 18699 1726882339.64296: variable 'ansible_search_path' from source: unknown 18699 1726882339.64310: variable 'ansible_search_path' from source: unknown 18699 1726882339.64336: calling self._execute() 18699 1726882339.64406: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882339.64411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.64420: variable 'omit' from source: magic vars 18699 1726882339.64718: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.64727: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882339.64732: variable 'omit' from source: magic vars 18699 1726882339.64759: variable 'omit' from source: magic vars 18699 1726882339.64844: variable 'network_provider' from source: set_fact 18699 1726882339.64853: variable 'omit' from source: magic vars 18699 1726882339.64913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882339.64929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882339.64946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882339.64960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882339.64985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882339.65067: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882339.65070: variable 'ansible_host' from 
source: host vars for 'managed_node1' 18699 1726882339.65073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.65151: Set connection var ansible_connection to ssh 18699 1726882339.65154: Set connection var ansible_pipelining to False 18699 1726882339.65156: Set connection var ansible_shell_executable to /bin/sh 18699 1726882339.65159: Set connection var ansible_timeout to 10 18699 1726882339.65161: Set connection var ansible_shell_type to sh 18699 1726882339.65163: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882339.65170: variable 'ansible_shell_executable' from source: unknown 18699 1726882339.65173: variable 'ansible_connection' from source: unknown 18699 1726882339.65175: variable 'ansible_module_compression' from source: unknown 18699 1726882339.65177: variable 'ansible_shell_type' from source: unknown 18699 1726882339.65179: variable 'ansible_shell_executable' from source: unknown 18699 1726882339.65198: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882339.65201: variable 'ansible_pipelining' from source: unknown 18699 1726882339.65203: variable 'ansible_timeout' from source: unknown 18699 1726882339.65225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.65350: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882339.65365: variable 'omit' from source: magic vars 18699 1726882339.65397: starting attempt loop 18699 1726882339.65401: running the handler 18699 1726882339.65461: handler run complete 18699 1726882339.65464: attempt loop complete, returning result 18699 1726882339.65467: _execute() done 18699 1726882339.65469: dumping result to json 18699 
1726882339.65472: done dumping result, returning 18699 1726882339.65474: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-1ce6-d207-00000000001a] 18699 1726882339.65478: sending task result for task 12673a56-9f93-1ce6-d207-00000000001a 18699 1726882339.65546: done sending task result for task 12673a56-9f93-1ce6-d207-00000000001a 18699 1726882339.65548: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 18699 1726882339.65615: no more pending results, returning what we have 18699 1726882339.65619: results queue empty 18699 1726882339.65619: checking for any_errors_fatal 18699 1726882339.65629: done checking for any_errors_fatal 18699 1726882339.65629: checking for max_fail_percentage 18699 1726882339.65631: done checking for max_fail_percentage 18699 1726882339.65632: checking to see if all hosts have failed and the running result is not ok 18699 1726882339.65632: done checking to see if all hosts have failed 18699 1726882339.65633: getting the remaining hosts for this loop 18699 1726882339.65634: done getting the remaining hosts for this loop 18699 1726882339.65637: getting the next task for host managed_node1 18699 1726882339.65643: done getting next task for host managed_node1 18699 1726882339.65649: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18699 1726882339.65652: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882339.65661: getting variables 18699 1726882339.65662: in VariableManager get_vars() 18699 1726882339.65698: Calling all_inventory to load vars for managed_node1 18699 1726882339.65701: Calling groups_inventory to load vars for managed_node1 18699 1726882339.65703: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882339.65711: Calling all_plugins_play to load vars for managed_node1 18699 1726882339.65714: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882339.65716: Calling groups_plugins_play to load vars for managed_node1 18699 1726882339.66836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.67786: done with get_vars() 18699 1726882339.67806: done getting variables 18699 1726882339.67859: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:32:19 -0400 (0:00:00.042) 0:00:13.274 ****** 18699 1726882339.67885: entering _queue_task() for managed_node1/fail 18699 1726882339.68116: worker is 1 (out of 1 available) 18699 1726882339.68130: exiting _queue_task() for managed_node1/fail 18699 1726882339.68142: done queuing things up, now waiting for results queue to drain 18699 1726882339.68143: waiting for pending results... 
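The skipped tasks in this trace all follow the same pattern: the `when:` condition (`network_state != {}`) is evaluated against the role's default of `{}`, comes out False, and the executor short-circuits with `skip_reason: Conditional result was False` instead of ever running the `fail` action. A rough sketch of that control flow, using hypothetical names rather than ansible-core internals:

```python
# Rough sketch of the conditional-skip behaviour visible in the trace:
# a task whose `when` evaluates False is returned as skipped, never executed.
# Function and key names here are hypothetical, not ansible-core internals.
def run_task(condition, variables, action):
    """Evaluate `condition` against `variables`; skip or run `action`."""
    if not condition(variables):
        return {"changed": False,
                "skipped": True,
                "skip_reason": "Conditional result was False"}
    return action(variables)

# Mirrors the trace: network_state defaults to {}, so the fail task is skipped.
result = run_task(lambda v: v["network_state"] != {},
                  {"network_state": {}},
                  lambda v: {"failed": True, "msg": "aborting"})
print(result["skip_reason"])  # -> Conditional result was False
```

This is why the trace logs `when evaluation is False, skipping this task` before the result is dumped: the guard runs inside `_execute()`, ahead of any module or action invocation.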
18699 1726882339.68311: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18699 1726882339.68380: in run() - task 12673a56-9f93-1ce6-d207-00000000001b 18699 1726882339.68389: variable 'ansible_search_path' from source: unknown 18699 1726882339.68397: variable 'ansible_search_path' from source: unknown 18699 1726882339.68424: calling self._execute() 18699 1726882339.68489: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882339.68497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.68505: variable 'omit' from source: magic vars 18699 1726882339.68885: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.68900: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882339.68960: variable 'network_state' from source: role '' defaults 18699 1726882339.68968: Evaluated conditional (network_state != {}): False 18699 1726882339.68972: when evaluation is False, skipping this task 18699 1726882339.68974: _execute() done 18699 1726882339.68980: dumping result to json 18699 1726882339.68992: done dumping result, returning 18699 1726882339.69000: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-1ce6-d207-00000000001b] 18699 1726882339.69003: sending task result for task 12673a56-9f93-1ce6-d207-00000000001b 18699 1726882339.69103: done sending task result for task 12673a56-9f93-1ce6-d207-00000000001b 18699 1726882339.69106: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882339.69156: no more pending results, 
returning what we have 18699 1726882339.69163: results queue empty 18699 1726882339.69164: checking for any_errors_fatal 18699 1726882339.69170: done checking for any_errors_fatal 18699 1726882339.69170: checking for max_fail_percentage 18699 1726882339.69172: done checking for max_fail_percentage 18699 1726882339.69173: checking to see if all hosts have failed and the running result is not ok 18699 1726882339.69174: done checking to see if all hosts have failed 18699 1726882339.69175: getting the remaining hosts for this loop 18699 1726882339.69176: done getting the remaining hosts for this loop 18699 1726882339.69179: getting the next task for host managed_node1 18699 1726882339.69184: done getting next task for host managed_node1 18699 1726882339.69188: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18699 1726882339.69190: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882339.69208: getting variables 18699 1726882339.69209: in VariableManager get_vars() 18699 1726882339.69242: Calling all_inventory to load vars for managed_node1 18699 1726882339.69245: Calling groups_inventory to load vars for managed_node1 18699 1726882339.69248: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882339.69257: Calling all_plugins_play to load vars for managed_node1 18699 1726882339.69259: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882339.69261: Calling groups_plugins_play to load vars for managed_node1 18699 1726882339.70171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.71226: done with get_vars() 18699 1726882339.71244: done getting variables 18699 1726882339.71302: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:32:19 -0400 (0:00:00.034) 0:00:13.309 ****** 18699 1726882339.71325: entering _queue_task() for managed_node1/fail 18699 1726882339.71571: worker is 1 (out of 1 available) 18699 1726882339.71585: exiting _queue_task() for managed_node1/fail 18699 1726882339.71597: done queuing things up, now waiting for results queue to drain 18699 1726882339.71599: waiting for pending results... 
18699 1726882339.71783: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18699 1726882339.71856: in run() - task 12673a56-9f93-1ce6-d207-00000000001c 18699 1726882339.71867: variable 'ansible_search_path' from source: unknown 18699 1726882339.71870: variable 'ansible_search_path' from source: unknown 18699 1726882339.71902: calling self._execute() 18699 1726882339.71999: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882339.72003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.72024: variable 'omit' from source: magic vars 18699 1726882339.72378: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.72387: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882339.72485: variable 'network_state' from source: role '' defaults 18699 1726882339.72496: Evaluated conditional (network_state != {}): False 18699 1726882339.72499: when evaluation is False, skipping this task 18699 1726882339.72516: _execute() done 18699 1726882339.72528: dumping result to json 18699 1726882339.72531: done dumping result, returning 18699 1726882339.72534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-1ce6-d207-00000000001c] 18699 1726882339.72537: sending task result for task 12673a56-9f93-1ce6-d207-00000000001c 18699 1726882339.72606: done sending task result for task 12673a56-9f93-1ce6-d207-00000000001c 18699 1726882339.72609: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882339.72662: no more pending results, returning what we have 18699 
1726882339.72665: results queue empty 18699 1726882339.72666: checking for any_errors_fatal 18699 1726882339.72672: done checking for any_errors_fatal 18699 1726882339.72673: checking for max_fail_percentage 18699 1726882339.72674: done checking for max_fail_percentage 18699 1726882339.72675: checking to see if all hosts have failed and the running result is not ok 18699 1726882339.72676: done checking to see if all hosts have failed 18699 1726882339.72676: getting the remaining hosts for this loop 18699 1726882339.72678: done getting the remaining hosts for this loop 18699 1726882339.72681: getting the next task for host managed_node1 18699 1726882339.72686: done getting next task for host managed_node1 18699 1726882339.72690: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18699 1726882339.72692: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882339.72707: getting variables 18699 1726882339.72709: in VariableManager get_vars() 18699 1726882339.72743: Calling all_inventory to load vars for managed_node1 18699 1726882339.72745: Calling groups_inventory to load vars for managed_node1 18699 1726882339.72747: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882339.72756: Calling all_plugins_play to load vars for managed_node1 18699 1726882339.72758: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882339.72761: Calling groups_plugins_play to load vars for managed_node1 18699 1726882339.73753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.74726: done with get_vars() 18699 1726882339.74740: done getting variables 18699 1726882339.74779: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:32:19 -0400 (0:00:00.034) 0:00:13.344 ****** 18699 1726882339.74805: entering _queue_task() for managed_node1/fail 18699 1726882339.75011: worker is 1 (out of 1 available) 18699 1726882339.75023: exiting _queue_task() for managed_node1/fail 18699 1726882339.75033: done queuing things up, now waiting for results queue to drain 18699 1726882339.75034: waiting for pending results... 
18699 1726882339.75207: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18699 1726882339.75277: in run() - task 12673a56-9f93-1ce6-d207-00000000001d 18699 1726882339.75288: variable 'ansible_search_path' from source: unknown 18699 1726882339.75291: variable 'ansible_search_path' from source: unknown 18699 1726882339.75321: calling self._execute() 18699 1726882339.75388: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882339.75392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.75403: variable 'omit' from source: magic vars 18699 1726882339.75664: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.75673: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882339.75785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882339.77445: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882339.77507: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882339.77536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882339.77562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882339.77581: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882339.77653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.77674: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882339.77691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.77722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.77732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.77801: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.77814: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18699 1726882339.77889: variable 'ansible_distribution' from source: facts 18699 1726882339.77897: variable '__network_rh_distros' from source: role '' defaults 18699 1726882339.77903: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18699 1726882339.78067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.78087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882339.78107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 
1726882339.78133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.78144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.78175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.78198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882339.78214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.78255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.78282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.78379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.78385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18699 1726882339.78388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.78452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.78455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.78885: variable 'network_connections' from source: play vars 18699 1726882339.78888: variable 'interface' from source: set_fact 18699 1726882339.78890: variable 'interface' from source: set_fact 18699 1726882339.78892: variable 'interface' from source: set_fact 18699 1726882339.79108: variable 'interface' from source: set_fact 18699 1726882339.79111: variable 'network_state' from source: role '' defaults 18699 1726882339.79113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882339.79301: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882339.79362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882339.79398: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882339.79431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882339.79542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882339.79616: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882339.79662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.79709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882339.79724: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18699 1726882339.79728: when evaluation is False, skipping this task 18699 1726882339.79730: _execute() done 18699 1726882339.79733: dumping result to json 18699 1726882339.79735: done dumping result, returning 18699 1726882339.79742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-1ce6-d207-00000000001d] 18699 1726882339.79745: sending task result for task 12673a56-9f93-1ce6-d207-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18699 1726882339.79873: no more pending results, returning what we have 18699 1726882339.79876: results queue empty 18699 1726882339.79877: checking for 
any_errors_fatal 18699 1726882339.79881: done checking for any_errors_fatal 18699 1726882339.79889: checking for max_fail_percentage 18699 1726882339.79896: done checking for max_fail_percentage 18699 1726882339.79897: checking to see if all hosts have failed and the running result is not ok 18699 1726882339.79898: done checking to see if all hosts have failed 18699 1726882339.79898: getting the remaining hosts for this loop 18699 1726882339.79900: done getting the remaining hosts for this loop 18699 1726882339.79903: getting the next task for host managed_node1 18699 1726882339.79910: done getting next task for host managed_node1 18699 1726882339.79913: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18699 1726882339.79915: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882339.79957: done sending task result for task 12673a56-9f93-1ce6-d207-00000000001d 18699 1726882339.79960: WORKER PROCESS EXITING 18699 1726882339.79978: getting variables 18699 1726882339.79980: in VariableManager get_vars() 18699 1726882339.80040: Calling all_inventory to load vars for managed_node1 18699 1726882339.80043: Calling groups_inventory to load vars for managed_node1 18699 1726882339.80045: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882339.80057: Calling all_plugins_play to load vars for managed_node1 18699 1726882339.80061: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882339.80065: Calling groups_plugins_play to load vars for managed_node1 18699 1726882339.81498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882339.84177: done with get_vars() 18699 1726882339.84206: done getting variables 18699 1726882339.84732: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:32:19 -0400 (0:00:00.100) 0:00:13.444 ****** 18699 1726882339.84821: entering _queue_task() for managed_node1/dnf 18699 1726882339.86456: worker is 1 (out of 1 available) 18699 1726882339.86468: exiting _queue_task() for managed_node1/dnf 18699 1726882339.86479: done queuing things up, now waiting for results queue to drain 18699 1726882339.86480: waiting for pending results... 
18699 1726882339.87255: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18699 1726882339.87260: in run() - task 12673a56-9f93-1ce6-d207-00000000001e 18699 1726882339.87264: variable 'ansible_search_path' from source: unknown 18699 1726882339.87268: variable 'ansible_search_path' from source: unknown 18699 1726882339.87726: calling self._execute() 18699 1726882339.87902: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882339.87906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882339.87908: variable 'omit' from source: magic vars 18699 1726882339.89134: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.89138: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882339.89822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882339.94112: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882339.94206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882339.94210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882339.94212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882339.94214: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882339.94391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.94422: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882339.94508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.94548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.94775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.94829: variable 'ansible_distribution' from source: facts 18699 1726882339.94833: variable 'ansible_distribution_major_version' from source: facts 18699 1726882339.94873: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18699 1726882339.95300: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882339.95304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.95307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882339.95310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.95417: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.95431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.95547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.95690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882339.95716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.95974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.95978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.95980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882339.96033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 
1726882339.96059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.96092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882339.96303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882339.96541: variable 'network_connections' from source: play vars 18699 1726882339.96559: variable 'interface' from source: set_fact 18699 1726882339.96628: variable 'interface' from source: set_fact 18699 1726882339.96636: variable 'interface' from source: set_fact 18699 1726882339.96875: variable 'interface' from source: set_fact 18699 1726882339.97099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882339.97451: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882339.97558: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882339.97644: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882339.97712: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882339.97825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882339.97921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882339.97971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882339.98108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882339.98205: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882339.98599: variable 'network_connections' from source: play vars 18699 1726882339.98722: variable 'interface' from source: set_fact 18699 1726882339.98845: variable 'interface' from source: set_fact 18699 1726882339.98856: variable 'interface' from source: set_fact 18699 1726882339.98921: variable 'interface' from source: set_fact 18699 1726882339.99098: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882339.99107: when evaluation is False, skipping this task 18699 1726882339.99113: _execute() done 18699 1726882339.99119: dumping result to json 18699 1726882339.99126: done dumping result, returning 18699 1726882339.99136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-00000000001e] 18699 1726882339.99207: sending task result for task 12673a56-9f93-1ce6-d207-00000000001e 18699 1726882339.99718: done sending task result for task 12673a56-9f93-1ce6-d207-00000000001e 18699 1726882339.99722: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18699 1726882339.99764: no more pending results, returning what we have 18699 1726882339.99767: results queue empty 18699 1726882339.99768: checking for any_errors_fatal 18699 1726882339.99773: done checking for any_errors_fatal 18699 1726882339.99774: checking for max_fail_percentage 18699 1726882339.99776: done checking for max_fail_percentage 18699 1726882339.99777: checking to see if all hosts have failed and the running result is not ok 18699 1726882339.99777: done checking to see if all hosts have failed 18699 1726882339.99778: getting the remaining hosts for this loop 18699 1726882339.99779: done getting the remaining hosts for this loop 18699 1726882339.99783: getting the next task for host managed_node1 18699 1726882339.99788: done getting next task for host managed_node1 18699 1726882339.99792: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18699 1726882339.99799: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882339.99813: getting variables 18699 1726882339.99815: in VariableManager get_vars() 18699 1726882339.99850: Calling all_inventory to load vars for managed_node1 18699 1726882339.99853: Calling groups_inventory to load vars for managed_node1 18699 1726882339.99855: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882339.99864: Calling all_plugins_play to load vars for managed_node1 18699 1726882339.99866: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882339.99870: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.02519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.05000: done with get_vars() 18699 1726882340.05036: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18699 1726882340.05116: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:32:20 -0400 (0:00:00.203) 0:00:13.647 ****** 18699 1726882340.05154: entering _queue_task() for managed_node1/yum 18699 1726882340.05156: Creating lock for yum 18699 1726882340.05850: worker is 1 (out of 1 available) 18699 1726882340.05862: exiting _queue_task() for managed_node1/yum 18699 1726882340.05873: done queuing things up, now waiting for results queue to drain 18699 1726882340.05873: waiting for pending results... 
18699 1726882340.06249: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18699 1726882340.06344: in run() - task 12673a56-9f93-1ce6-d207-00000000001f 18699 1726882340.06382: variable 'ansible_search_path' from source: unknown 18699 1726882340.06390: variable 'ansible_search_path' from source: unknown 18699 1726882340.06434: calling self._execute() 18699 1726882340.06636: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.06640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.06643: variable 'omit' from source: magic vars 18699 1726882340.07182: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.07400: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.07562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882340.12626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882340.12980: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882340.12984: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882340.13044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882340.13179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882340.13705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.13708: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.13711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.13921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.13925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.14049: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.14123: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18699 1726882340.14142: when evaluation is False, skipping this task 18699 1726882340.14356: _execute() done 18699 1726882340.14359: dumping result to json 18699 1726882340.14362: done dumping result, returning 18699 1726882340.14365: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-00000000001f] 18699 1726882340.14367: sending task result for task 12673a56-9f93-1ce6-d207-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18699 1726882340.14609: no more pending results, returning what we have 18699 1726882340.14613: results queue empty 18699 1726882340.14614: checking for any_errors_fatal 18699 1726882340.14620: done 
checking for any_errors_fatal 18699 1726882340.14621: checking for max_fail_percentage 18699 1726882340.14623: done checking for max_fail_percentage 18699 1726882340.14624: checking to see if all hosts have failed and the running result is not ok 18699 1726882340.14625: done checking to see if all hosts have failed 18699 1726882340.14626: getting the remaining hosts for this loop 18699 1726882340.14627: done getting the remaining hosts for this loop 18699 1726882340.14631: getting the next task for host managed_node1 18699 1726882340.14637: done getting next task for host managed_node1 18699 1726882340.14641: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18699 1726882340.14643: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882340.14657: getting variables 18699 1726882340.14659: in VariableManager get_vars() 18699 1726882340.14799: Calling all_inventory to load vars for managed_node1 18699 1726882340.14803: Calling groups_inventory to load vars for managed_node1 18699 1726882340.14812: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882340.14819: done sending task result for task 12673a56-9f93-1ce6-d207-00000000001f 18699 1726882340.14822: WORKER PROCESS EXITING 18699 1726882340.14833: Calling all_plugins_play to load vars for managed_node1 18699 1726882340.14836: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882340.14839: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.25413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.27172: done with get_vars() 18699 1726882340.27207: done getting variables 18699 1726882340.27267: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:32:20 -0400 (0:00:00.221) 0:00:13.869 ****** 18699 1726882340.27298: entering _queue_task() for managed_node1/fail 18699 1726882340.27662: worker is 1 (out of 1 available) 18699 1726882340.27675: exiting _queue_task() for managed_node1/fail 18699 1726882340.27809: done queuing things up, now waiting for results queue to drain 18699 1726882340.27810: waiting for pending results... 
18699 1726882340.28022: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18699 1726882340.28266: in run() - task 12673a56-9f93-1ce6-d207-000000000020 18699 1726882340.28269: variable 'ansible_search_path' from source: unknown 18699 1726882340.28272: variable 'ansible_search_path' from source: unknown 18699 1726882340.28322: calling self._execute() 18699 1726882340.28586: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.28590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.28599: variable 'omit' from source: magic vars 18699 1726882340.29021: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.29043: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.29176: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882340.29428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882340.31018: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882340.31069: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882340.31102: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882340.31128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882340.31148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882340.31211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18699 1726882340.31233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.31250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.31276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.31286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.31328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.31348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.31395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.31420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.31439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.31478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.31499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.31516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.31569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.31585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.31901: variable 'network_connections' from source: play vars 18699 1726882340.31904: variable 'interface' from source: set_fact 18699 1726882340.31907: variable 'interface' from source: set_fact 18699 1726882340.31910: variable 'interface' from source: set_fact 18699 1726882340.31912: variable 'interface' from source: set_fact 18699 1726882340.31943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882340.32118: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882340.32151: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882340.32179: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882340.32207: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882340.32335: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882340.32338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882340.32400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.32404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882340.32436: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882340.32823: variable 'network_connections' from source: play vars 18699 1726882340.32827: variable 'interface' from source: set_fact 18699 1726882340.32830: variable 'interface' from source: set_fact 18699 1726882340.32832: variable 'interface' from source: set_fact 18699 1726882340.32872: variable 'interface' from source: set_fact 18699 1726882340.32912: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882340.32916: when evaluation is False, skipping this task 18699 1726882340.32918: _execute() done 18699 1726882340.32921: dumping result to json 18699 1726882340.32925: done dumping result, returning 18699 1726882340.32927: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000020] 18699 1726882340.32936: sending task result for task 12673a56-9f93-1ce6-d207-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18699 1726882340.33224: no more pending results, returning what we have 18699 1726882340.33228: results queue empty 18699 1726882340.33229: checking for any_errors_fatal 18699 1726882340.33236: done checking for any_errors_fatal 18699 1726882340.33236: checking for max_fail_percentage 18699 1726882340.33238: done checking for max_fail_percentage 18699 1726882340.33238: checking to see if all hosts have failed and the running result is not ok 18699 1726882340.33239: done checking to see if all hosts have failed 18699 1726882340.33240: getting the remaining hosts for this loop 18699 1726882340.33241: done getting the remaining hosts for this loop 18699 1726882340.33244: getting the next task for host managed_node1 18699 1726882340.33249: done getting next task for host managed_node1 18699 1726882340.33252: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18699 1726882340.33254: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882340.33269: getting variables 18699 1726882340.33270: in VariableManager get_vars() 18699 1726882340.33309: Calling all_inventory to load vars for managed_node1 18699 1726882340.33313: Calling groups_inventory to load vars for managed_node1 18699 1726882340.33315: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882340.33325: Calling all_plugins_play to load vars for managed_node1 18699 1726882340.33327: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882340.33330: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.33920: done sending task result for task 12673a56-9f93-1ce6-d207-000000000020 18699 1726882340.33923: WORKER PROCESS EXITING 18699 1726882340.34627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.35527: done with get_vars() 18699 1726882340.35541: done getting variables 18699 1726882340.35585: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:32:20 -0400 (0:00:00.083) 0:00:13.952 ****** 18699 1726882340.35610: entering _queue_task() for managed_node1/package 18699 1726882340.35820: worker is 1 (out of 1 available) 18699 1726882340.35833: exiting _queue_task() for managed_node1/package 18699 1726882340.35844: done queuing things up, now waiting for results queue to drain 18699 1726882340.35845: waiting for pending results... 
18699 1726882340.36008: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18699 1726882340.36067: in run() - task 12673a56-9f93-1ce6-d207-000000000021 18699 1726882340.36081: variable 'ansible_search_path' from source: unknown 18699 1726882340.36085: variable 'ansible_search_path' from source: unknown 18699 1726882340.36113: calling self._execute() 18699 1726882340.36181: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.36186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.36203: variable 'omit' from source: magic vars 18699 1726882340.36462: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.36481: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.36707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882340.36920: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882340.36951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882340.36982: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882340.37050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882340.37150: variable 'network_packages' from source: role '' defaults 18699 1726882340.37248: variable '__network_provider_setup' from source: role '' defaults 18699 1726882340.37260: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882340.37327: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882340.37336: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882340.37483: variable 
'__network_packages_default_nm' from source: role '' defaults 18699 1726882340.37566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882340.39120: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882340.39160: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882340.39186: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882340.39218: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882340.39237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882340.39292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.39316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.39336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.39363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.39373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 
1726882340.39407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.39423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.39444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.39468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.39479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.39615: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18699 1726882340.39690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.39720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.39736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.39764: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.39776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.39837: variable 'ansible_python' from source: facts 18699 1726882340.39856: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18699 1726882340.39915: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882340.39970: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882340.40054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.40070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.40091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.40118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.40129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.40159: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.40178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.40200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.40226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.40236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.40330: variable 'network_connections' from source: play vars 18699 1726882340.40335: variable 'interface' from source: set_fact 18699 1726882340.40405: variable 'interface' from source: set_fact 18699 1726882340.40413: variable 'interface' from source: set_fact 18699 1726882340.40480: variable 'interface' from source: set_fact 18699 1726882340.40536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882340.40553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882340.40573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.40597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882340.40650: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882340.40880: variable 'network_connections' from source: play vars 18699 1726882340.40883: variable 'interface' from source: set_fact 18699 1726882340.40968: variable 'interface' from source: set_fact 18699 1726882340.40971: variable 'interface' from source: set_fact 18699 1726882340.41071: variable 'interface' from source: set_fact 18699 1726882340.41121: variable '__network_packages_default_wireless' from source: role '' defaults 18699 1726882340.41209: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882340.41461: variable 'network_connections' from source: play vars 18699 1726882340.41464: variable 'interface' from source: set_fact 18699 1726882340.41543: variable 'interface' from source: set_fact 18699 1726882340.41546: variable 'interface' from source: set_fact 18699 1726882340.41611: variable 'interface' from source: set_fact 18699 1726882340.41630: variable '__network_packages_default_team' from source: role '' defaults 18699 1726882340.41692: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882340.41886: variable 'network_connections' from source: play vars 18699 1726882340.41889: variable 'interface' from source: set_fact 18699 1726882340.41936: variable 'interface' from source: set_fact 18699 1726882340.41942: variable 'interface' from source: set_fact 18699 1726882340.41987: variable 'interface' from source: set_fact 18699 1726882340.42055: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 
1726882340.42107: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882340.42110: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882340.42187: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882340.42358: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18699 1726882340.42765: variable 'network_connections' from source: play vars 18699 1726882340.42768: variable 'interface' from source: set_fact 18699 1726882340.42848: variable 'interface' from source: set_fact 18699 1726882340.42851: variable 'interface' from source: set_fact 18699 1726882340.42914: variable 'interface' from source: set_fact 18699 1726882340.42927: variable 'ansible_distribution' from source: facts 18699 1726882340.42952: variable '__network_rh_distros' from source: role '' defaults 18699 1726882340.42956: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.42959: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18699 1726882340.43158: variable 'ansible_distribution' from source: facts 18699 1726882340.43161: variable '__network_rh_distros' from source: role '' defaults 18699 1726882340.43163: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.43178: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18699 1726882340.43351: variable 'ansible_distribution' from source: facts 18699 1726882340.43354: variable '__network_rh_distros' from source: role '' defaults 18699 1726882340.43404: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.43407: variable 'network_provider' from source: set_fact 18699 1726882340.43427: variable 'ansible_facts' from source: unknown 18699 1726882340.44046: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 18699 1726882340.44050: when evaluation is False, skipping this task 18699 1726882340.44057: _execute() done 18699 1726882340.44060: dumping result to json 18699 1726882340.44063: done dumping result, returning 18699 1726882340.44065: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-1ce6-d207-000000000021] 18699 1726882340.44068: sending task result for task 12673a56-9f93-1ce6-d207-000000000021 18699 1726882340.44183: done sending task result for task 12673a56-9f93-1ce6-d207-000000000021 18699 1726882340.44186: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18699 1726882340.44285: no more pending results, returning what we have 18699 1726882340.44288: results queue empty 18699 1726882340.44289: checking for any_errors_fatal 18699 1726882340.44296: done checking for any_errors_fatal 18699 1726882340.44297: checking for max_fail_percentage 18699 1726882340.44299: done checking for max_fail_percentage 18699 1726882340.44300: checking to see if all hosts have failed and the running result is not ok 18699 1726882340.44300: done checking to see if all hosts have failed 18699 1726882340.44301: getting the remaining hosts for this loop 18699 1726882340.44302: done getting the remaining hosts for this loop 18699 1726882340.44305: getting the next task for host managed_node1 18699 1726882340.44312: done getting next task for host managed_node1 18699 1726882340.44316: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18699 1726882340.44317: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882340.44330: getting variables 18699 1726882340.44331: in VariableManager get_vars() 18699 1726882340.44365: Calling all_inventory to load vars for managed_node1 18699 1726882340.44368: Calling groups_inventory to load vars for managed_node1 18699 1726882340.44370: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882340.44382: Calling all_plugins_play to load vars for managed_node1 18699 1726882340.44384: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882340.44386: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.45602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.46653: done with get_vars() 18699 1726882340.46676: done getting variables 18699 1726882340.46727: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:32:20 -0400 (0:00:00.111) 0:00:14.063 ****** 18699 1726882340.46748: entering _queue_task() for managed_node1/package 18699 1726882340.46979: worker is 1 (out of 1 available) 18699 1726882340.46996: exiting _queue_task() for managed_node1/package 18699 1726882340.47009: done queuing things up, now waiting for results queue to drain 18699 1726882340.47010: waiting for pending results... 
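The "Install packages" task above was skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated False — every required package was already present in the gathered package facts. A minimal sketch of that Jinja2 `subset` test's semantics (package names and fact data below are illustrative, not from this run):

```python
def needs_install(network_packages, installed_package_names):
    """True when at least one required package is missing, i.e. when the
    Ansible condition
        not network_packages is subset(ansible_facts.packages.keys())
    would evaluate True and the install task would run."""
    return not set(network_packages).issubset(installed_package_names)

# Hypothetical facts as the package_facts module would report them:
installed = {"NetworkManager": [{"version": "1.48"}], "openssh": [{"version": "9.3"}]}

print(needs_install(["NetworkManager"], installed.keys()))             # False -> skipped
print(needs_install(["NetworkManager", "nmstate"], installed.keys()))  # True  -> would run
```

When the check returns False, Ansible reports the skip with `"false_condition"` set to the failing expression, exactly as in the log above.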
18699 1726882340.47199: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18699 1726882340.47272: in run() - task 12673a56-9f93-1ce6-d207-000000000022 18699 1726882340.47286: variable 'ansible_search_path' from source: unknown 18699 1726882340.47289: variable 'ansible_search_path' from source: unknown 18699 1726882340.47320: calling self._execute() 18699 1726882340.47392: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.47400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.47403: variable 'omit' from source: magic vars 18699 1726882340.47678: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.47687: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.47771: variable 'network_state' from source: role '' defaults 18699 1726882340.47782: Evaluated conditional (network_state != {}): False 18699 1726882340.47789: when evaluation is False, skipping this task 18699 1726882340.47791: _execute() done 18699 1726882340.47798: dumping result to json 18699 1726882340.47801: done dumping result, returning 18699 1726882340.47804: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-1ce6-d207-000000000022] 18699 1726882340.47807: sending task result for task 12673a56-9f93-1ce6-d207-000000000022 18699 1726882340.47890: done sending task result for task 12673a56-9f93-1ce6-d207-000000000022 18699 1726882340.47899: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882340.47946: no more pending results, returning what we have 18699 1726882340.47949: results queue empty 18699 1726882340.47951: checking 
for any_errors_fatal 18699 1726882340.47959: done checking for any_errors_fatal 18699 1726882340.47959: checking for max_fail_percentage 18699 1726882340.47961: done checking for max_fail_percentage 18699 1726882340.47962: checking to see if all hosts have failed and the running result is not ok 18699 1726882340.47963: done checking to see if all hosts have failed 18699 1726882340.47963: getting the remaining hosts for this loop 18699 1726882340.47965: done getting the remaining hosts for this loop 18699 1726882340.47968: getting the next task for host managed_node1 18699 1726882340.47973: done getting next task for host managed_node1 18699 1726882340.47977: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18699 1726882340.47979: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882340.47991: getting variables 18699 1726882340.48000: in VariableManager get_vars() 18699 1726882340.48037: Calling all_inventory to load vars for managed_node1 18699 1726882340.48040: Calling groups_inventory to load vars for managed_node1 18699 1726882340.48042: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882340.48050: Calling all_plugins_play to load vars for managed_node1 18699 1726882340.48052: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882340.48054: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.48816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.49723: done with get_vars() 18699 1726882340.49741: done getting variables 18699 1726882340.49779: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:32:20 -0400 (0:00:00.030) 0:00:14.094 ****** 18699 1726882340.49803: entering _queue_task() for managed_node1/package 18699 1726882340.50000: worker is 1 (out of 1 available) 18699 1726882340.50013: exiting _queue_task() for managed_node1/package 18699 1726882340.50024: done queuing things up, now waiting for results queue to drain 18699 1726882340.50025: waiting for pending results... 
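The nmstate install task above passed its distro gate (`ansible_distribution_major_version != '6'`: True) but was skipped on `network_state != {}`: the role defaults `network_state` to an empty dict, so the task only runs when the caller supplies one. A sketch of that two-stage gate under those assumed semantics:

```python
def should_install_nmstate(facts, network_state):
    """Models the two conditionals logged above (assumption: evaluated in
    this order, short-circuiting on the first False)."""
    if facts["ansible_distribution_major_version"] == "6":
        return False           # distro gate fails -> task skipped
    return network_state != {}  # role default {} -> skipped unless user sets it

facts = {"ansible_distribution_major_version": "40"}  # hypothetical value
print(should_install_nmstate(facts, {}))                    # False, as in this run
print(should_install_nmstate(facts, {"interfaces": []}))    # True
```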
18699 1726882340.50189: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18699 1726882340.50255: in run() - task 12673a56-9f93-1ce6-d207-000000000023 18699 1726882340.50265: variable 'ansible_search_path' from source: unknown 18699 1726882340.50269: variable 'ansible_search_path' from source: unknown 18699 1726882340.50300: calling self._execute() 18699 1726882340.50367: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.50371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.50379: variable 'omit' from source: magic vars 18699 1726882340.50644: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.50652: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.50736: variable 'network_state' from source: role '' defaults 18699 1726882340.50744: Evaluated conditional (network_state != {}): False 18699 1726882340.50747: when evaluation is False, skipping this task 18699 1726882340.50750: _execute() done 18699 1726882340.50753: dumping result to json 18699 1726882340.50755: done dumping result, returning 18699 1726882340.50763: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-1ce6-d207-000000000023] 18699 1726882340.50767: sending task result for task 12673a56-9f93-1ce6-d207-000000000023 18699 1726882340.50856: done sending task result for task 12673a56-9f93-1ce6-d207-000000000023 18699 1726882340.50859: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882340.50903: no more pending results, returning what we have 18699 1726882340.50907: results queue empty 18699 1726882340.50908: checking for 
any_errors_fatal 18699 1726882340.50915: done checking for any_errors_fatal 18699 1726882340.50916: checking for max_fail_percentage 18699 1726882340.50917: done checking for max_fail_percentage 18699 1726882340.50918: checking to see if all hosts have failed and the running result is not ok 18699 1726882340.50919: done checking to see if all hosts have failed 18699 1726882340.50919: getting the remaining hosts for this loop 18699 1726882340.50921: done getting the remaining hosts for this loop 18699 1726882340.50924: getting the next task for host managed_node1 18699 1726882340.50928: done getting next task for host managed_node1 18699 1726882340.50931: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18699 1726882340.50933: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882340.50945: getting variables 18699 1726882340.50947: in VariableManager get_vars() 18699 1726882340.50976: Calling all_inventory to load vars for managed_node1 18699 1726882340.50978: Calling groups_inventory to load vars for managed_node1 18699 1726882340.50980: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882340.50987: Calling all_plugins_play to load vars for managed_node1 18699 1726882340.50989: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882340.50992: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.52218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.53368: done with get_vars() 18699 1726882340.53383: done getting variables 18699 1726882340.53452: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:32:20 -0400 (0:00:00.036) 0:00:14.130 ****** 18699 1726882340.53473: entering _queue_task() for managed_node1/service 18699 1726882340.53475: Creating lock for service 18699 1726882340.53762: worker is 1 (out of 1 available) 18699 1726882340.53774: exiting _queue_task() for managed_node1/service 18699 1726882340.53788: done queuing things up, now waiting for results queue to drain 18699 1726882340.53790: waiting for pending results... 
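Each skipped task above emits the same result shape: `changed: false`, the expression that failed, and a fixed skip reason. A hedged model of that result dict (field names taken from the log; this is not Ansible's internal API):

```python
def skip_result(false_condition):
    """Builds the per-host result Ansible prints for a conditionally
    skipped task, matching the fields shown in this log."""
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

print(skip_result("network_state != {}"))
```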
18699 1726882340.53992: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18699 1726882340.54059: in run() - task 12673a56-9f93-1ce6-d207-000000000024 18699 1726882340.54069: variable 'ansible_search_path' from source: unknown 18699 1726882340.54072: variable 'ansible_search_path' from source: unknown 18699 1726882340.54103: calling self._execute() 18699 1726882340.54185: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.54189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.54204: variable 'omit' from source: magic vars 18699 1726882340.54501: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.54510: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.54603: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882340.54721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882340.56781: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882340.56851: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882340.56958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882340.56961: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882340.56982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882340.57041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18699 1726882340.57064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.57081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.57112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.57122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.57172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.57196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.57216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.57241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.57252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.57279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.57305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.57322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.57346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.57356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.57513: variable 'network_connections' from source: play vars 18699 1726882340.57522: variable 'interface' from source: set_fact 18699 1726882340.57572: variable 'interface' from source: set_fact 18699 1726882340.57580: variable 'interface' from source: set_fact 18699 1726882340.57628: variable 'interface' from source: set_fact 18699 1726882340.57676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882340.57795: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882340.57824: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882340.57846: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882340.57870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882340.57903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882340.57918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882340.57937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.57958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882340.58004: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882340.58164: variable 'network_connections' from source: play vars 18699 1726882340.58167: variable 'interface' from source: set_fact 18699 1726882340.58244: variable 'interface' from source: set_fact 18699 1726882340.58248: variable 'interface' from source: set_fact 18699 1726882340.58304: variable 'interface' from source: set_fact 18699 1726882340.58499: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882340.58503: when evaluation is False, skipping this task 18699 1726882340.58505: _execute() done 18699 1726882340.58507: dumping result to json 18699 1726882340.58510: done dumping result, returning 18699 1726882340.58512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000024] 18699 1726882340.58520: sending task result for task 12673a56-9f93-1ce6-d207-000000000024 18699 1726882340.58577: done sending task result for task 12673a56-9f93-1ce6-d207-000000000024 18699 1726882340.58580: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18699 1726882340.58735: no more pending results, returning what we have 18699 1726882340.58738: results queue empty 18699 1726882340.58739: checking for any_errors_fatal 18699 1726882340.58744: done checking for any_errors_fatal 18699 1726882340.58744: checking for max_fail_percentage 18699 1726882340.58746: done checking for max_fail_percentage 18699 1726882340.58747: checking to see if all hosts have failed and the running result is not ok 18699 1726882340.58747: done checking to see if all hosts have failed 18699 1726882340.58748: getting the remaining hosts for this loop 18699 1726882340.58749: done getting the remaining hosts for this loop 18699 1726882340.58752: getting the next task for host managed_node1 18699 1726882340.58757: done getting next task for host managed_node1 18699 1726882340.58761: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18699 1726882340.58763: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882340.58774: getting variables 18699 1726882340.58776: in VariableManager get_vars() 18699 1726882340.58817: Calling all_inventory to load vars for managed_node1 18699 1726882340.58819: Calling groups_inventory to load vars for managed_node1 18699 1726882340.58822: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882340.58830: Calling all_plugins_play to load vars for managed_node1 18699 1726882340.58832: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882340.58834: Calling groups_plugins_play to load vars for managed_node1 18699 1726882340.60851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882340.63402: done with get_vars() 18699 1726882340.63431: done getting variables 18699 1726882340.63513: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:32:20 -0400 (0:00:00.100) 0:00:14.231 ****** 18699 1726882340.63548: entering _queue_task() for managed_node1/service 18699 1726882340.64130: worker is 1 (out of 1 available) 18699 1726882340.64211: exiting _queue_task() for managed_node1/service 18699 1726882340.64221: done queuing things up, now waiting for results queue to drain 18699 1726882340.64222: waiting for pending results... 
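The NetworkManager-restart task above was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was True for this play's connections. One plausible way such flags are derived (a sketch, assuming each entry in `network_connections` carries a `type` key; the connection data below is illustrative):

```python
def any_connection_of_type(network_connections, conn_type):
    """True when any user-supplied connection profile has the given type,
    e.g. 'wireless' or 'team'."""
    return any(c.get("type") == conn_type for c in network_connections)

connections = [{"name": "ethtest0", "type": "ethernet", "state": "up"}]
wireless = any_connection_of_type(connections, "wireless")
team = any_connection_of_type(connections, "team")
print(wireless or team)  # False -> restart skipped, as logged
```

With only an ethernet profile defined, the combined condition is False and no restart is needed; the following "Enable and start NetworkManager" task instead runs because `network_provider == "nm"` holds.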
18699 1726882340.64535: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18699 1726882340.64926: in run() - task 12673a56-9f93-1ce6-d207-000000000025 18699 1726882340.64929: variable 'ansible_search_path' from source: unknown 18699 1726882340.64932: variable 'ansible_search_path' from source: unknown 18699 1726882340.64935: calling self._execute() 18699 1726882340.64937: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.64940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.64942: variable 'omit' from source: magic vars 18699 1726882340.65576: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.65601: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882340.66101: variable 'network_provider' from source: set_fact 18699 1726882340.66221: variable 'network_state' from source: role '' defaults 18699 1726882340.66264: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18699 1726882340.66329: variable 'omit' from source: magic vars 18699 1726882340.66372: variable 'omit' from source: magic vars 18699 1726882340.66466: variable 'network_service_name' from source: role '' defaults 18699 1726882340.66632: variable 'network_service_name' from source: role '' defaults 18699 1726882340.66868: variable '__network_provider_setup' from source: role '' defaults 18699 1726882340.66880: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882340.66952: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882340.66972: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882340.67038: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882340.67295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18699 1726882340.69625: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882340.69795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882340.69799: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882340.69802: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882340.69826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882340.69916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.69950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.69978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.70035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.70053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.70102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18699 1726882340.70139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.70166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.70210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.70251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.70518: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18699 1726882340.70664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.70773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.70776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.70779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.70781: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.70853: variable 'ansible_python' from source: facts 18699 1726882340.70885: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18699 1726882340.71302: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882340.71305: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882340.71464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.71635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.71663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.71706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.71751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.71891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882340.71929: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882340.72172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.72175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882340.72178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882340.72376: variable 'network_connections' from source: play vars 18699 1726882340.72509: variable 'interface' from source: set_fact 18699 1726882340.72583: variable 'interface' from source: set_fact 18699 1726882340.72600: variable 'interface' from source: set_fact 18699 1726882340.72769: variable 'interface' from source: set_fact 18699 1726882340.72974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882340.73207: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882340.73275: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882340.73326: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882340.73383: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882340.73449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882340.73498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882340.73537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882340.73574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882340.73639: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882340.73955: variable 'network_connections' from source: play vars 18699 1726882340.73967: variable 'interface' from source: set_fact 18699 1726882340.74055: variable 'interface' from source: set_fact 18699 1726882340.74071: variable 'interface' from source: set_fact 18699 1726882340.74158: variable 'interface' from source: set_fact 18699 1726882340.74215: variable '__network_packages_default_wireless' from source: role '' defaults 18699 1726882340.74312: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882340.74631: variable 'network_connections' from source: play vars 18699 1726882340.74690: variable 'interface' from source: set_fact 18699 1726882340.74724: variable 'interface' from source: set_fact 18699 1726882340.74735: variable 'interface' from source: set_fact 18699 1726882340.74811: variable 'interface' from source: set_fact 18699 1726882340.74841: variable '__network_packages_default_team' from source: role '' defaults 18699 1726882340.74934: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882340.75225: variable 
'network_connections' from source: play vars 18699 1726882340.75306: variable 'interface' from source: set_fact 18699 1726882340.75317: variable 'interface' from source: set_fact 18699 1726882340.75336: variable 'interface' from source: set_fact 18699 1726882340.75411: variable 'interface' from source: set_fact 18699 1726882340.75480: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882340.75546: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882340.75565: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882340.75627: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882340.75849: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18699 1726882340.76405: variable 'network_connections' from source: play vars 18699 1726882340.76416: variable 'interface' from source: set_fact 18699 1726882340.76483: variable 'interface' from source: set_fact 18699 1726882340.76508: variable 'interface' from source: set_fact 18699 1726882340.76599: variable 'interface' from source: set_fact 18699 1726882340.76602: variable 'ansible_distribution' from source: facts 18699 1726882340.76604: variable '__network_rh_distros' from source: role '' defaults 18699 1726882340.76606: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.76621: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18699 1726882340.76792: variable 'ansible_distribution' from source: facts 18699 1726882340.76805: variable '__network_rh_distros' from source: role '' defaults 18699 1726882340.76815: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.76863: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18699 1726882340.77023: variable 'ansible_distribution' from source: 
facts 18699 1726882340.77032: variable '__network_rh_distros' from source: role '' defaults 18699 1726882340.77043: variable 'ansible_distribution_major_version' from source: facts 18699 1726882340.77095: variable 'network_provider' from source: set_fact 18699 1726882340.77122: variable 'omit' from source: magic vars 18699 1726882340.77152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882340.77190: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882340.77217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882340.77302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882340.77306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882340.77308: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882340.77311: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.77313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.77407: Set connection var ansible_connection to ssh 18699 1726882340.77423: Set connection var ansible_pipelining to False 18699 1726882340.77434: Set connection var ansible_shell_executable to /bin/sh 18699 1726882340.77545: Set connection var ansible_timeout to 10 18699 1726882340.77548: Set connection var ansible_shell_type to sh 18699 1726882340.77550: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882340.77551: variable 'ansible_shell_executable' from source: unknown 18699 1726882340.77553: variable 'ansible_connection' from source: unknown 18699 1726882340.77555: variable 'ansible_module_compression' from source: unknown 18699 1726882340.77556: 
variable 'ansible_shell_type' from source: unknown 18699 1726882340.77558: variable 'ansible_shell_executable' from source: unknown 18699 1726882340.77560: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882340.77567: variable 'ansible_pipelining' from source: unknown 18699 1726882340.77568: variable 'ansible_timeout' from source: unknown 18699 1726882340.77570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882340.77733: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882340.77736: variable 'omit' from source: magic vars 18699 1726882340.77738: starting attempt loop 18699 1726882340.77740: running the handler 18699 1726882340.77784: variable 'ansible_facts' from source: unknown 18699 1726882340.79604: _low_level_execute_command(): starting 18699 1726882340.79607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882340.80858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882340.80872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882340.81023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882340.81035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882340.81095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882340.82890: stdout chunk (state=3): >>>/root <<< 18699 1726882340.83229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882340.83232: stdout chunk (state=3): >>><<< 18699 1726882340.83234: stderr chunk (state=3): >>><<< 18699 1726882340.83303: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882340.83306: _low_level_execute_command(): starting 18699 1726882340.83310: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741 `" && echo ansible-tmp-1726882340.8325264-19405-109637420868741="` echo /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741 `" ) && sleep 0' 18699 1726882340.84330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882340.84342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882340.84353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882340.84720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882340.84733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882340.84814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 18699 1726882340.86677: stdout chunk (state=3): >>>ansible-tmp-1726882340.8325264-19405-109637420868741=/root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741 <<< 18699 1726882340.86772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882340.86891: stderr chunk (state=3): >>><<< 18699 1726882340.86895: stdout chunk (state=3): >>><<< 18699 1726882340.86898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882340.8325264-19405-109637420868741=/root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882340.86901: variable 'ansible_module_compression' from source: unknown 18699 1726882340.87024: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 18699 1726882340.87033: ANSIBALLZ: Acquiring lock 18699 
1726882340.87317: ANSIBALLZ: Lock acquired: 140254445799856 18699 1726882340.87319: ANSIBALLZ: Creating module 18699 1726882341.36960: ANSIBALLZ: Writing module into payload 18699 1726882341.37144: ANSIBALLZ: Writing module 18699 1726882341.37179: ANSIBALLZ: Renaming module 18699 1726882341.37192: ANSIBALLZ: Done creating module 18699 1726882341.37244: variable 'ansible_facts' from source: unknown 18699 1726882341.37486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py 18699 1726882341.37720: Sending initial data 18699 1726882341.37723: Sent initial data (156 bytes) 18699 1726882341.38370: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882341.38389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882341.38566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882341.40217: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882341.40257: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882341.40306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmprufckth_ /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py <<< 18699 1726882341.40309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py" <<< 18699 1726882341.40358: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmprufckth_" to remote "/root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py" <<< 18699 1726882341.43284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882341.43290: stdout chunk (state=3): >>><<< 18699 1726882341.43301: stderr chunk (state=3): >>><<< 18699 1726882341.43359: done transferring module to remote 18699 1726882341.43369: _low_level_execute_command(): 
starting 18699 1726882341.43376: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/ /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py && sleep 0' 18699 1726882341.44820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882341.44825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882341.44854: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882341.44858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18699 1726882341.45001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882341.45004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882341.45307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882341.47162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882341.47166: stderr chunk (state=3): >>><<< 18699 1726882341.47168: stdout chunk (state=3): >>><<< 18699 
1726882341.47197: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882341.47200: _low_level_execute_command(): starting 18699 1726882341.47203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/AnsiballZ_systemd.py && sleep 0' 18699 1726882341.48491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882341.48591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882341.48674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882341.48818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882341.77553: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10772480", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312652288", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1245170000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18699 1726882341.79236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882341.79404: stderr chunk (state=3): >>><<< 18699 1726882341.79408: stdout chunk (state=3): >>><<< 18699 1726882341.79411: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10772480", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312652288", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1245170000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882341.79740: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882341.79901: _low_level_execute_command(): starting 18699 1726882341.79905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882340.8325264-19405-109637420868741/ > /dev/null 2>&1 && sleep 0' 18699 1726882341.81128: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882341.81144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882341.81662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882341.81665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882341.83543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882341.83579: stderr chunk (state=3): >>><<< 18699 1726882341.84003: stdout chunk (state=3): >>><<< 18699 1726882341.84007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882341.84010: handler run complete 18699 1726882341.84012: attempt loop complete, returning result 18699 
1726882341.84014: _execute() done 18699 1726882341.84016: dumping result to json 18699 1726882341.84018: done dumping result, returning 18699 1726882341.84020: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-1ce6-d207-000000000025] 18699 1726882341.84023: sending task result for task 12673a56-9f93-1ce6-d207-000000000025 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882341.84372: no more pending results, returning what we have 18699 1726882341.84375: results queue empty 18699 1726882341.84377: checking for any_errors_fatal 18699 1726882341.84383: done checking for any_errors_fatal 18699 1726882341.84383: checking for max_fail_percentage 18699 1726882341.84386: done checking for max_fail_percentage 18699 1726882341.84386: checking to see if all hosts have failed and the running result is not ok 18699 1726882341.84387: done checking to see if all hosts have failed 18699 1726882341.84388: getting the remaining hosts for this loop 18699 1726882341.84389: done getting the remaining hosts for this loop 18699 1726882341.84395: getting the next task for host managed_node1 18699 1726882341.84406: done getting next task for host managed_node1 18699 1726882341.84411: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18699 1726882341.84412: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882341.84423: getting variables 18699 1726882341.84425: in VariableManager get_vars() 18699 1726882341.84456: Calling all_inventory to load vars for managed_node1 18699 1726882341.84459: Calling groups_inventory to load vars for managed_node1 18699 1726882341.84461: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882341.84470: Calling all_plugins_play to load vars for managed_node1 18699 1726882341.84472: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882341.84475: Calling groups_plugins_play to load vars for managed_node1 18699 1726882341.85256: done sending task result for task 12673a56-9f93-1ce6-d207-000000000025 18699 1726882341.85260: WORKER PROCESS EXITING 18699 1726882341.87169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882341.90483: done with get_vars() 18699 1726882341.90515: done getting variables 18699 1726882341.90577: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:32:21 -0400 (0:00:01.271) 0:00:15.503 ****** 18699 1726882341.90725: entering _queue_task() for managed_node1/service 18699 1726882341.91390: worker is 1 (out of 1 available) 18699 1726882341.91470: exiting _queue_task() for managed_node1/service 18699 1726882341.91483: done queuing things up, now waiting for results queue to drain 18699 1726882341.91484: waiting for pending results... 
18699 1726882341.91797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18699 1726882341.92001: in run() - task 12673a56-9f93-1ce6-d207-000000000026 18699 1726882341.92055: variable 'ansible_search_path' from source: unknown 18699 1726882341.92256: variable 'ansible_search_path' from source: unknown 18699 1726882341.92261: calling self._execute() 18699 1726882341.92391: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882341.92410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882341.92427: variable 'omit' from source: magic vars 18699 1726882341.93218: variable 'ansible_distribution_major_version' from source: facts 18699 1726882341.93242: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882341.93479: variable 'network_provider' from source: set_fact 18699 1726882341.93491: Evaluated conditional (network_provider == "nm"): True 18699 1726882341.93653: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882341.93862: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882341.94211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882341.99018: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882341.99101: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882341.99143: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882341.99207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882341.99240: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882341.99335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882341.99371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882341.99417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882341.99464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882341.99499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882341.99605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882341.99609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882341.99615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882341.99663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882341.99683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882341.99744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882341.99820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882341.99823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882341.99858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882341.99877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882342.00048: variable 'network_connections' from source: play vars 18699 1726882342.00069: variable 'interface' from source: set_fact 18699 1726882342.00166: variable 'interface' from source: set_fact 18699 1726882342.00200: variable 'interface' from source: set_fact 18699 1726882342.00245: variable 'interface' from source: set_fact 18699 1726882342.00339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882342.00550: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882342.00801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882342.00805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882342.00808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882342.00811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882342.00813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882342.00814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882342.00816: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882342.00849: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882342.01106: variable 'network_connections' from source: play vars 18699 1726882342.01117: variable 'interface' from source: set_fact 18699 1726882342.01189: variable 'interface' from source: set_fact 18699 1726882342.01209: variable 'interface' from source: set_fact 18699 1726882342.01280: variable 'interface' from source: set_fact 18699 1726882342.01328: Evaluated conditional (__network_wpa_supplicant_required): False 18699 1726882342.01335: when evaluation is False, skipping this task 18699 1726882342.01342: _execute() done 18699 1726882342.01357: dumping result 
to json 18699 1726882342.01375: done dumping result, returning 18699 1726882342.01388: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-1ce6-d207-000000000026] 18699 1726882342.01403: sending task result for task 12673a56-9f93-1ce6-d207-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18699 1726882342.01671: no more pending results, returning what we have 18699 1726882342.01676: results queue empty 18699 1726882342.01677: checking for any_errors_fatal 18699 1726882342.01700: done checking for any_errors_fatal 18699 1726882342.01702: checking for max_fail_percentage 18699 1726882342.01704: done checking for max_fail_percentage 18699 1726882342.01705: checking to see if all hosts have failed and the running result is not ok 18699 1726882342.01705: done checking to see if all hosts have failed 18699 1726882342.01706: getting the remaining hosts for this loop 18699 1726882342.01708: done getting the remaining hosts for this loop 18699 1726882342.01712: getting the next task for host managed_node1 18699 1726882342.01718: done getting next task for host managed_node1 18699 1726882342.01722: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18699 1726882342.01724: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882342.01745: getting variables 18699 1726882342.01748: in VariableManager get_vars() 18699 1726882342.01785: Calling all_inventory to load vars for managed_node1 18699 1726882342.01787: Calling groups_inventory to load vars for managed_node1 18699 1726882342.01790: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882342.01849: done sending task result for task 12673a56-9f93-1ce6-d207-000000000026 18699 1726882342.01852: WORKER PROCESS EXITING 18699 1726882342.01907: Calling all_plugins_play to load vars for managed_node1 18699 1726882342.01912: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882342.01915: Calling groups_plugins_play to load vars for managed_node1 18699 1726882342.03791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882342.05478: done with get_vars() 18699 1726882342.05503: done getting variables 18699 1726882342.05571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:32:22 -0400 (0:00:00.148) 0:00:15.652 ****** 18699 1726882342.05602: entering _queue_task() for managed_node1/service 18699 1726882342.06045: worker is 1 (out of 1 available) 18699 1726882342.06057: exiting _queue_task() for managed_node1/service 18699 1726882342.06069: done queuing things up, now waiting for results queue to drain 18699 1726882342.06070: waiting for pending results... 
18699 1726882342.06547: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18699 1726882342.06748: in run() - task 12673a56-9f93-1ce6-d207-000000000027 18699 1726882342.06768: variable 'ansible_search_path' from source: unknown 18699 1726882342.06901: variable 'ansible_search_path' from source: unknown 18699 1726882342.06905: calling self._execute() 18699 1726882342.07166: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882342.07170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882342.07173: variable 'omit' from source: magic vars 18699 1726882342.07961: variable 'ansible_distribution_major_version' from source: facts 18699 1726882342.08055: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882342.08371: variable 'network_provider' from source: set_fact 18699 1726882342.08383: Evaluated conditional (network_provider == "initscripts"): False 18699 1726882342.08390: when evaluation is False, skipping this task 18699 1726882342.08403: _execute() done 18699 1726882342.08414: dumping result to json 18699 1726882342.08424: done dumping result, returning 18699 1726882342.08436: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-1ce6-d207-000000000027] 18699 1726882342.08485: sending task result for task 12673a56-9f93-1ce6-d207-000000000027 18699 1726882342.08765: done sending task result for task 12673a56-9f93-1ce6-d207-000000000027 18699 1726882342.08769: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882342.08850: no more pending results, returning what we have 18699 1726882342.08854: results queue empty 18699 1726882342.08855: checking for any_errors_fatal 18699 1726882342.08862: done checking for 
any_errors_fatal 18699 1726882342.08863: checking for max_fail_percentage 18699 1726882342.08865: done checking for max_fail_percentage 18699 1726882342.08866: checking to see if all hosts have failed and the running result is not ok 18699 1726882342.08866: done checking to see if all hosts have failed 18699 1726882342.08867: getting the remaining hosts for this loop 18699 1726882342.08868: done getting the remaining hosts for this loop 18699 1726882342.08872: getting the next task for host managed_node1 18699 1726882342.08882: done getting next task for host managed_node1 18699 1726882342.08886: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18699 1726882342.08890: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882342.08914: getting variables 18699 1726882342.08917: in VariableManager get_vars() 18699 1726882342.08956: Calling all_inventory to load vars for managed_node1 18699 1726882342.08960: Calling groups_inventory to load vars for managed_node1 18699 1726882342.08962: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882342.08975: Calling all_plugins_play to load vars for managed_node1 18699 1726882342.08978: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882342.08981: Calling groups_plugins_play to load vars for managed_node1 18699 1726882342.10900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882342.12920: done with get_vars() 18699 1726882342.12942: done getting variables 18699 1726882342.13060: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:32:22 -0400 (0:00:00.074) 0:00:15.726 ****** 18699 1726882342.13090: entering _queue_task() for managed_node1/copy 18699 1726882342.13603: worker is 1 (out of 1 available) 18699 1726882342.13615: exiting _queue_task() for managed_node1/copy 18699 1726882342.13627: done queuing things up, now waiting for results queue to drain 18699 1726882342.13629: waiting for pending results... 
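The skip decisions recorded above follow a fixed pattern: each `when` conditional is evaluated in order, and the first one that comes out False short-circuits the task into a skip result. A minimal sketch of that evaluation order follows; the fact values are illustrative (the log only shows that the major version is not '6', and that the provider resolved to `nm` rather than `initscripts`), and `evaluate_task` is a made-up helper, not an Ansible API.

```python
# Minimal model of the two-step 'when' evaluation in the log: a task runs only
# if every conditional is truthy; the first False short-circuits to a skip.
def evaluate_task(facts: dict, conditionals: list) -> dict:
    for cond in conditionals:
        if not cond(facts):
            return {"changed": False, "skipped": True,
                    "skip_reason": "Conditional result was False"}
    return {"changed": True}

# Illustrative fact values; only the comparison outcomes match the log.
facts = {"ansible_distribution_major_version": "9",
         "network_provider": "nm"}

result = evaluate_task(facts, [
    lambda f: f["ansible_distribution_major_version"] != "6",  # True
    lambda f: f["network_provider"] == "initscripts",          # False -> skip
])
print(result["skip_reason"])
```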
18699 1726882342.14122: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18699 1726882342.14127: in run() - task 12673a56-9f93-1ce6-d207-000000000028 18699 1726882342.14130: variable 'ansible_search_path' from source: unknown 18699 1726882342.14133: variable 'ansible_search_path' from source: unknown 18699 1726882342.14136: calling self._execute() 18699 1726882342.14138: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882342.14141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882342.14143: variable 'omit' from source: magic vars 18699 1726882342.14496: variable 'ansible_distribution_major_version' from source: facts 18699 1726882342.14509: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882342.14632: variable 'network_provider' from source: set_fact 18699 1726882342.14648: Evaluated conditional (network_provider == "initscripts"): False 18699 1726882342.14651: when evaluation is False, skipping this task 18699 1726882342.14654: _execute() done 18699 1726882342.14656: dumping result to json 18699 1726882342.14658: done dumping result, returning 18699 1726882342.14664: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-1ce6-d207-000000000028] 18699 1726882342.14669: sending task result for task 12673a56-9f93-1ce6-d207-000000000028 18699 1726882342.14875: done sending task result for task 12673a56-9f93-1ce6-d207-000000000028 18699 1726882342.14878: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18699 1726882342.14924: no more pending results, returning what we have 18699 1726882342.14927: results queue empty 18699 1726882342.14929: checking for 
any_errors_fatal 18699 1726882342.14934: done checking for any_errors_fatal 18699 1726882342.14934: checking for max_fail_percentage 18699 1726882342.14937: done checking for max_fail_percentage 18699 1726882342.14937: checking to see if all hosts have failed and the running result is not ok 18699 1726882342.14938: done checking to see if all hosts have failed 18699 1726882342.14939: getting the remaining hosts for this loop 18699 1726882342.14940: done getting the remaining hosts for this loop 18699 1726882342.14943: getting the next task for host managed_node1 18699 1726882342.14949: done getting next task for host managed_node1 18699 1726882342.14952: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18699 1726882342.14955: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882342.14972: getting variables 18699 1726882342.14974: in VariableManager get_vars() 18699 1726882342.15019: Calling all_inventory to load vars for managed_node1 18699 1726882342.15023: Calling groups_inventory to load vars for managed_node1 18699 1726882342.15025: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882342.15036: Calling all_plugins_play to load vars for managed_node1 18699 1726882342.15039: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882342.15042: Calling groups_plugins_play to load vars for managed_node1 18699 1726882342.17773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882342.21065: done with get_vars() 18699 1726882342.21223: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:32:22 -0400 (0:00:00.083) 0:00:15.810 ****** 18699 1726882342.21410: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18699 1726882342.21412: Creating lock for fedora.linux_system_roles.network_connections 18699 1726882342.22238: worker is 1 (out of 1 available) 18699 1726882342.22250: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18699 1726882342.22261: done queuing things up, now waiting for results queue to drain 18699 1726882342.22262: waiting for pending results... 
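The "Creating lock for fedora.linux_system_roles.network_connections" line above marks the first use of this module in the run: a per-module lock lets one worker build the module payload while any later workers reuse the cached build. The sketch below is a simplified model of that idea under those assumptions; the registry dicts and `get_module_payload` helper are invented for illustration and are not Ansible's implementation.

```python
# Illustrative sketch of the per-module lock created above: the first caller
# builds the payload; repeat callers get the cached name and no rebuild.
import threading

_module_locks: dict = {}
_payload_cache: dict = {}
_registry_lock = threading.Lock()
build_count = 0

def get_module_payload(fqcn: str) -> str:
    global build_count
    with _registry_lock:  # "Creating lock for <module>"
        lock = _module_locks.setdefault(fqcn, threading.Lock())
    with lock:            # acquire the per-module lock before building
        if fqcn not in _payload_cache:
            build_count += 1  # the expensive build happens only once
            _payload_cache[fqcn] = f"AnsiballZ_{fqcn.rsplit('.', 1)[-1]}.py"
        return _payload_cache[fqcn]

names = [get_module_payload("fedora.linux_system_roles.network_connections")
         for _ in range(3)]
print(names[0], build_count)
```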
18699 1726882342.22712: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18699 1726882342.22901: in run() - task 12673a56-9f93-1ce6-d207-000000000029 18699 1726882342.22924: variable 'ansible_search_path' from source: unknown 18699 1726882342.22932: variable 'ansible_search_path' from source: unknown 18699 1726882342.22978: calling self._execute() 18699 1726882342.23210: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882342.23492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882342.23503: variable 'omit' from source: magic vars 18699 1726882342.24191: variable 'ansible_distribution_major_version' from source: facts 18699 1726882342.24219: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882342.24232: variable 'omit' from source: magic vars 18699 1726882342.24281: variable 'omit' from source: magic vars 18699 1726882342.24656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882342.30203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882342.30209: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882342.30211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882342.30213: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882342.30801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882342.30805: variable 'network_provider' from source: set_fact 18699 1726882342.30961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882342.31084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882342.31179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882342.31276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882342.31337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882342.31590: variable 'omit' from source: magic vars 18699 1726882342.31839: variable 'omit' from source: magic vars 18699 1726882342.32112: variable 'network_connections' from source: play vars 18699 1726882342.32129: variable 'interface' from source: set_fact 18699 1726882342.32197: variable 'interface' from source: set_fact 18699 1726882342.32327: variable 'interface' from source: set_fact 18699 1726882342.32388: variable 'interface' from source: set_fact 18699 1726882342.32571: variable 'omit' from source: magic vars 18699 1726882342.32718: variable '__lsr_ansible_managed' from source: task vars 18699 1726882342.32802: variable '__lsr_ansible_managed' from source: task vars 18699 1726882342.34118: Loaded config def from plugin (lookup/template) 18699 1726882342.34128: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18699 1726882342.34161: File lookup term: get_ansible_managed.j2 18699 
1726882342.34169: variable 'ansible_search_path' from source: unknown 18699 1726882342.34207: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18699 1726882342.34230: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18699 1726882342.34399: variable 'ansible_search_path' from source: unknown 18699 1726882342.47129: variable 'ansible_managed' from source: unknown 18699 1726882342.47448: variable 'omit' from source: magic vars 18699 1726882342.47524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882342.47558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882342.47618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882342.47707: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882342.47723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882342.47756: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882342.47842: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882342.47853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882342.48005: Set connection var ansible_connection to ssh 18699 1726882342.48198: Set connection var ansible_pipelining to False 18699 1726882342.48201: Set connection var ansible_shell_executable to /bin/sh 18699 1726882342.48204: Set connection var ansible_timeout to 10 18699 1726882342.48206: Set connection var ansible_shell_type to sh 18699 1726882342.48208: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882342.48210: variable 'ansible_shell_executable' from source: unknown 18699 1726882342.48213: variable 'ansible_connection' from source: unknown 18699 1726882342.48215: variable 'ansible_module_compression' from source: unknown 18699 1726882342.48221: variable 'ansible_shell_type' from source: unknown 18699 1726882342.48228: variable 'ansible_shell_executable' from source: unknown 18699 1726882342.48240: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882342.48250: variable 'ansible_pipelining' from source: unknown 18699 1726882342.48352: variable 'ansible_timeout' from source: unknown 18699 1726882342.48356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882342.49160: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882342.49171: variable 'omit' from source: magic vars 18699 1726882342.49174: starting attempt loop 18699 1726882342.49176: running the handler 18699 1726882342.49179: _low_level_execute_command(): starting 18699 1726882342.49181: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882342.51049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882342.51052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882342.51055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882342.51057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882342.51215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882342.51238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882342.51359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882342.51406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 18699 1726882342.53070: stdout chunk (state=3): >>>/root <<< 18699 1726882342.53168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882342.53322: stderr chunk (state=3): >>><<< 18699 1726882342.53357: stdout chunk (state=3): >>><<< 18699 1726882342.53608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882342.53613: _low_level_execute_command(): starting 18699 1726882342.53617: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558 `" && echo ansible-tmp-1726882342.5351703-19488-272939447643558="` echo /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558 `" ) && sleep 0' 18699 
1726882342.55309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882342.55446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882342.55733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882342.55767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882342.57665: stdout chunk (state=3): >>>ansible-tmp-1726882342.5351703-19488-272939447643558=/root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558 <<< 18699 1726882342.57824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882342.57830: stdout chunk (state=3): >>><<< 18699 1726882342.57834: stderr chunk (state=3): >>><<< 18699 1726882342.57950: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882342.5351703-19488-272939447643558=/root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882342.57953: variable 'ansible_module_compression' from source: unknown 18699 1726882342.58061: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 18699 1726882342.58069: ANSIBALLZ: Acquiring lock 18699 1726882342.58076: ANSIBALLZ: Lock acquired: 140254443802640 18699 1726882342.58084: ANSIBALLZ: Creating module 18699 1726882343.04326: ANSIBALLZ: Writing module into payload 18699 1726882343.05063: ANSIBALLZ: Writing module 18699 1726882343.05299: ANSIBALLZ: Renaming module 18699 1726882343.05303: ANSIBALLZ: Done creating module 18699 1726882343.05305: variable 'ansible_facts' from source: unknown 18699 1726882343.05307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py 18699 1726882343.05918: Sending initial data 18699 1726882343.05928: Sent initial data (168 
bytes) 18699 1726882343.07010: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882343.07092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882343.07213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882343.07297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882343.09213: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882343.09252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882343.09824: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpxkr6fg7h /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py <<< 18699 1726882343.09828: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpxkr6fg7h" to remote "/root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py" <<< 18699 1726882343.11735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882343.11902: stderr chunk (state=3): >>><<< 18699 1726882343.11906: stdout chunk (state=3): >>><<< 18699 1726882343.11908: done transferring module to remote 18699 1726882343.11911: _low_level_execute_command(): starting 18699 1726882343.11914: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/ /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py && sleep 0' 18699 1726882343.13251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882343.13265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882343.13277: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882343.13644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882343.13647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882343.14009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882343.15526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882343.15557: stderr chunk (state=3): >>><<< 18699 1726882343.15567: stdout chunk (state=3): >>><<< 18699 1726882343.15589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882343.15601: _low_level_execute_command(): starting 18699 1726882343.15611: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/AnsiballZ_network_connections.py && sleep 0' 18699 1726882343.16764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882343.16777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882343.16788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882343.16960: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882343.17049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882343.17060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882343.17128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882343.59500: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18699 1726882343.61433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882343.61480: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882343.61538: stderr chunk (state=3): >>><<< 18699 1726882343.61542: stdout chunk (state=3): >>><<< 18699 1726882343.61597: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882343.61640: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882343.61649: _low_level_execute_command(): starting 18699 1726882343.61654: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882342.5351703-19488-272939447643558/ > /dev/null 2>&1 && sleep 0' 18699 1726882343.62935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882343.63008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882343.63023: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882343.63041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882343.63048: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882343.63058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882343.63072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882343.63152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882343.63283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882343.63339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882343.63472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882343.65601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882343.65605: stdout chunk (state=3): >>><<< 18699 1726882343.65607: stderr chunk (state=3): >>><<< 18699 1726882343.65610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882343.65612: handler run complete 18699 1726882343.65614: attempt loop complete, returning result 18699 1726882343.65616: _execute() done 18699 1726882343.65617: dumping result to json 18699 1726882343.65619: done dumping result, returning 18699 1726882343.65622: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-1ce6-d207-000000000029] 18699 1726882343.65624: sending task result for task 12673a56-9f93-1ce6-d207-000000000029 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 
a5a9140a-b936-48d0-9f96-c02df457936c (not-active) 18699 1726882343.65941: done sending task result for task 12673a56-9f93-1ce6-d207-000000000029 18699 1726882343.66009: no more pending results, returning what we have 18699 1726882343.66301: WORKER PROCESS EXITING 18699 1726882343.66312: results queue empty 18699 1726882343.66361: checking for any_errors_fatal 18699 1726882343.66369: done checking for any_errors_fatal 18699 1726882343.66370: checking for max_fail_percentage 18699 1726882343.66372: done checking for max_fail_percentage 18699 1726882343.66372: checking to see if all hosts have failed and the running result is not ok 18699 1726882343.66373: done checking to see if all hosts have failed 18699 1726882343.66374: getting the remaining hosts for this loop 18699 1726882343.66375: done getting the remaining hosts for this loop 18699 1726882343.66379: getting the next task for host managed_node1 18699 1726882343.66385: done getting next task for host managed_node1 18699 1726882343.66389: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18699 1726882343.66391: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882343.66497: getting variables 18699 1726882343.66499: in VariableManager get_vars() 18699 1726882343.66539: Calling all_inventory to load vars for managed_node1 18699 1726882343.66542: Calling groups_inventory to load vars for managed_node1 18699 1726882343.66544: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882343.66554: Calling all_plugins_play to load vars for managed_node1 18699 1726882343.66556: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882343.66558: Calling groups_plugins_play to load vars for managed_node1 18699 1726882343.70486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882343.72131: done with get_vars() 18699 1726882343.72159: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:32:23 -0400 (0:00:01.508) 0:00:17.318 ****** 18699 1726882343.72256: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18699 1726882343.72258: Creating lock for fedora.linux_system_roles.network_state 18699 1726882343.72676: worker is 1 (out of 1 available) 18699 1726882343.72690: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18699 1726882343.72706: done queuing things up, now waiting for results queue to drain 18699 1726882343.72707: waiting for pending results... 
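The `module_args` echoed in the "Configure networking connection profiles" result above map one-to-one onto the role's `network_connections` input variable. A minimal sketch of the play variables that would produce that invocation — the connection settings are taken verbatim from the logged result; everything else (e.g. where the vars live) is an assumption:

```yaml
# Sketch reconstructed from the logged module_args; only the values
# shown in the result above are taken from the log. The role itself
# selects provider "nm" and adds the "# Ansible managed" header.
network_connections:
  - name: lsr27
    interface_name: lsr27
    type: ethernet
    state: up
    autoconnect: true
    ip:
      address: 192.0.2.1/24
```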
18699 1726882343.72963: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18699 1726882343.73065: in run() - task 12673a56-9f93-1ce6-d207-00000000002a 18699 1726882343.73088: variable 'ansible_search_path' from source: unknown 18699 1726882343.73091: variable 'ansible_search_path' from source: unknown 18699 1726882343.73128: calling self._execute() 18699 1726882343.73223: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.73227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882343.73239: variable 'omit' from source: magic vars 18699 1726882343.73621: variable 'ansible_distribution_major_version' from source: facts 18699 1726882343.73642: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882343.73767: variable 'network_state' from source: role '' defaults 18699 1726882343.73776: Evaluated conditional (network_state != {}): False 18699 1726882343.73779: when evaluation is False, skipping this task 18699 1726882343.73782: _execute() done 18699 1726882343.73785: dumping result to json 18699 1726882343.73787: done dumping result, returning 18699 1726882343.74101: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-1ce6-d207-00000000002a] 18699 1726882343.74104: sending task result for task 12673a56-9f93-1ce6-d207-00000000002a 18699 1726882343.74165: done sending task result for task 12673a56-9f93-1ce6-d207-00000000002a 18699 1726882343.74169: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882343.74213: no more pending results, returning what we have 18699 1726882343.74217: results queue empty 18699 1726882343.74218: checking for any_errors_fatal 18699 1726882343.74226: done checking for any_errors_fatal 
18699 1726882343.74227: checking for max_fail_percentage 18699 1726882343.74229: done checking for max_fail_percentage 18699 1726882343.74230: checking to see if all hosts have failed and the running result is not ok 18699 1726882343.74230: done checking to see if all hosts have failed 18699 1726882343.74231: getting the remaining hosts for this loop 18699 1726882343.74232: done getting the remaining hosts for this loop 18699 1726882343.74236: getting the next task for host managed_node1 18699 1726882343.74241: done getting next task for host managed_node1 18699 1726882343.74245: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18699 1726882343.74247: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882343.74260: getting variables 18699 1726882343.74262: in VariableManager get_vars() 18699 1726882343.74299: Calling all_inventory to load vars for managed_node1 18699 1726882343.74302: Calling groups_inventory to load vars for managed_node1 18699 1726882343.74304: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882343.74313: Calling all_plugins_play to load vars for managed_node1 18699 1726882343.74316: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882343.74319: Calling groups_plugins_play to load vars for managed_node1 18699 1726882343.77126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882343.80969: done with get_vars() 18699 1726882343.81008: done getting variables 18699 1726882343.81068: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:32:23 -0400 (0:00:00.088) 0:00:17.407 ****** 18699 1726882343.81101: entering _queue_task() for managed_node1/debug 18699 1726882343.81734: worker is 1 (out of 1 available) 18699 1726882343.81747: exiting _queue_task() for managed_node1/debug 18699 1726882343.81873: done queuing things up, now waiting for results queue to drain 18699 1726882343.81875: waiting for pending results... 
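The skip just above ("Evaluated conditional (network_state != {}): False") follows from the role default `network_state: {}` — the trace shows the variable resolving "from source: role '' defaults". A hedged sketch of what a caller would pass to take the other branch; the nmstate-style shape below is an assumption based on the role's documented interface, not something shown in this log:

```yaml
# Assumed example: a non-empty network_state would flip the
# (network_state != {}) conditional and run the skipped task.
network_state:
  interfaces:
    - name: lsr27
      type: ethernet
      state: up
```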
18699 1726882343.82217: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18699 1726882343.82387: in run() - task 12673a56-9f93-1ce6-d207-00000000002b 18699 1726882343.82511: variable 'ansible_search_path' from source: unknown 18699 1726882343.82514: variable 'ansible_search_path' from source: unknown 18699 1726882343.82556: calling self._execute() 18699 1726882343.82643: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.82775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882343.82785: variable 'omit' from source: magic vars 18699 1726882343.83876: variable 'ansible_distribution_major_version' from source: facts 18699 1726882343.84008: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882343.84019: variable 'omit' from source: magic vars 18699 1726882343.84106: variable 'omit' from source: magic vars 18699 1726882343.84533: variable 'omit' from source: magic vars 18699 1726882343.84573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882343.85016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882343.85019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882343.85022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882343.85024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882343.85291: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882343.85299: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.85301: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18699 1726882343.85902: Set connection var ansible_connection to ssh 18699 1726882343.85907: Set connection var ansible_pipelining to False 18699 1726882343.85910: Set connection var ansible_shell_executable to /bin/sh 18699 1726882343.85913: Set connection var ansible_timeout to 10 18699 1726882343.85915: Set connection var ansible_shell_type to sh 18699 1726882343.85917: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882343.85920: variable 'ansible_shell_executable' from source: unknown 18699 1726882343.85922: variable 'ansible_connection' from source: unknown 18699 1726882343.85924: variable 'ansible_module_compression' from source: unknown 18699 1726882343.85926: variable 'ansible_shell_type' from source: unknown 18699 1726882343.85929: variable 'ansible_shell_executable' from source: unknown 18699 1726882343.85931: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.85933: variable 'ansible_pipelining' from source: unknown 18699 1726882343.85936: variable 'ansible_timeout' from source: unknown 18699 1726882343.85938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882343.86351: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882343.86446: variable 'omit' from source: magic vars 18699 1726882343.86449: starting attempt loop 18699 1726882343.86452: running the handler 18699 1726882343.86820: variable '__network_connections_result' from source: set_fact 18699 1726882343.86863: handler run complete 18699 1726882343.86867: attempt loop complete, returning result 18699 1726882343.86870: _execute() done 18699 1726882343.86872: dumping result to json 18699 1726882343.86875: 
done dumping result, returning 18699 1726882343.86877: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-1ce6-d207-00000000002b] 18699 1726882343.86879: sending task result for task 12673a56-9f93-1ce6-d207-00000000002b 18699 1726882343.87398: done sending task result for task 12673a56-9f93-1ce6-d207-00000000002b ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c (not-active)" ] } 18699 1726882343.87455: no more pending results, returning what we have 18699 1726882343.87459: results queue empty 18699 1726882343.87460: checking for any_errors_fatal 18699 1726882343.87468: done checking for any_errors_fatal 18699 1726882343.87468: checking for max_fail_percentage 18699 1726882343.87470: done checking for max_fail_percentage 18699 1726882343.87471: checking to see if all hosts have failed and the running result is not ok 18699 1726882343.87472: done checking to see if all hosts have failed 18699 1726882343.87473: getting the remaining hosts for this loop 18699 1726882343.87474: done getting the remaining hosts for this loop 18699 1726882343.87478: getting the next task for host managed_node1 18699 1726882343.87485: done getting next task for host managed_node1 18699 1726882343.87489: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18699 1726882343.87491: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882343.87505: getting variables 18699 1726882343.87507: in VariableManager get_vars() 18699 1726882343.87547: Calling all_inventory to load vars for managed_node1 18699 1726882343.87550: Calling groups_inventory to load vars for managed_node1 18699 1726882343.87553: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882343.87562: Calling all_plugins_play to load vars for managed_node1 18699 1726882343.87564: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882343.87567: Calling groups_plugins_play to load vars for managed_node1 18699 1726882343.88934: WORKER PROCESS EXITING 18699 1726882343.91854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882343.94021: done with get_vars() 18699 1726882343.94045: done getting variables 18699 1726882343.94206: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:32:23 -0400 (0:00:00.131) 0:00:17.538 ****** 18699 1726882343.94242: entering _queue_task() for managed_node1/debug 18699 1726882343.94949: worker is 1 (out of 1 available) 18699 1726882343.94961: exiting _queue_task() for managed_node1/debug 18699 1726882343.95198: done queuing things up, now waiting for results queue to drain 18699 1726882343.95200: waiting for pending results... 
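The two "Show … messages" tasks bracketing this point are plain `debug` actions over the registered module result — the output above prints exactly `__network_connections_result.stderr_lines`. A sketch of the pattern (the variable name is taken from the log; the task body is illustrative, not the role's verbatim source):

```yaml
# Illustrative: prints the registered result's stderr lines,
# matching the "ok: [managed_node1]" output shown above.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```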
18699 1726882343.95491: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18699 1726882343.95706: in run() - task 12673a56-9f93-1ce6-d207-00000000002c 18699 1726882343.95764: variable 'ansible_search_path' from source: unknown 18699 1726882343.95970: variable 'ansible_search_path' from source: unknown 18699 1726882343.95974: calling self._execute() 18699 1726882343.96115: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.96126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882343.96141: variable 'omit' from source: magic vars 18699 1726882343.96572: variable 'ansible_distribution_major_version' from source: facts 18699 1726882343.96589: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882343.96604: variable 'omit' from source: magic vars 18699 1726882343.96654: variable 'omit' from source: magic vars 18699 1726882343.96700: variable 'omit' from source: magic vars 18699 1726882343.96752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882343.96791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882343.96821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882343.96853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882343.96871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882343.96911: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882343.96920: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.96928: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18699 1726882343.97029: Set connection var ansible_connection to ssh 18699 1726882343.97041: Set connection var ansible_pipelining to False 18699 1726882343.97058: Set connection var ansible_shell_executable to /bin/sh 18699 1726882343.97068: Set connection var ansible_timeout to 10 18699 1726882343.97074: Set connection var ansible_shell_type to sh 18699 1726882343.97083: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882343.97123: variable 'ansible_shell_executable' from source: unknown 18699 1726882343.97132: variable 'ansible_connection' from source: unknown 18699 1726882343.97139: variable 'ansible_module_compression' from source: unknown 18699 1726882343.97146: variable 'ansible_shell_type' from source: unknown 18699 1726882343.97152: variable 'ansible_shell_executable' from source: unknown 18699 1726882343.97164: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882343.97172: variable 'ansible_pipelining' from source: unknown 18699 1726882343.97178: variable 'ansible_timeout' from source: unknown 18699 1726882343.97185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882343.97604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882343.97608: variable 'omit' from source: magic vars 18699 1726882343.97610: starting attempt loop 18699 1726882343.97612: running the handler 18699 1726882343.97614: variable '__network_connections_result' from source: set_fact 18699 1726882343.97756: variable '__network_connections_result' from source: set_fact 18699 1726882343.98055: handler run complete 18699 1726882343.98084: attempt loop complete, returning result 18699 1726882343.98091: 
_execute() done 18699 1726882343.98103: dumping result to json 18699 1726882343.98153: done dumping result, returning 18699 1726882343.98164: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-1ce6-d207-00000000002c] 18699 1726882343.98172: sending task result for task 12673a56-9f93-1ce6-d207-00000000002c 18699 1726882343.98534: done sending task result for task 12673a56-9f93-1ce6-d207-00000000002c 18699 1726882343.98538: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, a5a9140a-b936-48d0-9f96-c02df457936c (not-active)" ] } } 18699 1726882343.98630: no more pending results, returning what we have 18699 1726882343.98633: results queue empty 18699 1726882343.98634: checking for any_errors_fatal 18699 1726882343.98641: done checking for any_errors_fatal 18699 1726882343.98642: checking for max_fail_percentage 18699 1726882343.98644: done checking for max_fail_percentage 18699 1726882343.98645: checking to see if all hosts have failed and the running result is not ok 18699 1726882343.98646: done 
checking to see if all hosts have failed 18699 1726882343.98646: getting the remaining hosts for this loop 18699 1726882343.98649: done getting the remaining hosts for this loop 18699 1726882343.98652: getting the next task for host managed_node1 18699 1726882343.98660: done getting next task for host managed_node1 18699 1726882343.98664: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18699 1726882343.98666: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882343.98676: getting variables 18699 1726882343.98678: in VariableManager get_vars() 18699 1726882343.98923: Calling all_inventory to load vars for managed_node1 18699 1726882343.98927: Calling groups_inventory to load vars for managed_node1 18699 1726882343.98929: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882343.98939: Calling all_plugins_play to load vars for managed_node1 18699 1726882343.98943: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882343.98946: Calling groups_plugins_play to load vars for managed_node1 18699 1726882344.02180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882344.04553: done with get_vars() 18699 1726882344.04576: done getting variables 18699 1726882344.04640: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:32:24 -0400 (0:00:00.104) 0:00:17.642 ****** 18699 1726882344.04670: entering _queue_task() for managed_node1/debug 18699 1726882344.04983: worker is 1 (out of 1 available) 18699 1726882344.05100: exiting _queue_task() for managed_node1/debug 18699 1726882344.05111: done queuing things up, now waiting for results queue to drain 18699 1726882344.05112: waiting for pending results... 18699 1726882344.05285: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18699 1726882344.05411: in run() - task 12673a56-9f93-1ce6-d207-00000000002d 18699 1726882344.05430: variable 'ansible_search_path' from source: unknown 18699 1726882344.05437: variable 'ansible_search_path' from source: unknown 18699 1726882344.05482: calling self._execute() 18699 1726882344.05597: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882344.05603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882344.05699: variable 'omit' from source: magic vars 18699 1726882344.06008: variable 'ansible_distribution_major_version' from source: facts 18699 1726882344.06033: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882344.06165: variable 'network_state' from source: role '' defaults 18699 1726882344.06180: Evaluated conditional (network_state != {}): False 18699 1726882344.06187: when evaluation is False, skipping this task 18699 1726882344.06199: _execute() done 18699 1726882344.06208: dumping result to json 18699 1726882344.06216: done dumping result, returning 18699 1726882344.06228: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-1ce6-d207-00000000002d] 18699 1726882344.06247: sending task result for task 
12673a56-9f93-1ce6-d207-00000000002d 18699 1726882344.06411: done sending task result for task 12673a56-9f93-1ce6-d207-00000000002d 18699 1726882344.06414: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 18699 1726882344.06464: no more pending results, returning what we have 18699 1726882344.06467: results queue empty 18699 1726882344.06469: checking for any_errors_fatal 18699 1726882344.06477: done checking for any_errors_fatal 18699 1726882344.06478: checking for max_fail_percentage 18699 1726882344.06480: done checking for max_fail_percentage 18699 1726882344.06481: checking to see if all hosts have failed and the running result is not ok 18699 1726882344.06482: done checking to see if all hosts have failed 18699 1726882344.06482: getting the remaining hosts for this loop 18699 1726882344.06484: done getting the remaining hosts for this loop 18699 1726882344.06488: getting the next task for host managed_node1 18699 1726882344.06500: done getting next task for host managed_node1 18699 1726882344.06504: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18699 1726882344.06507: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882344.06522: getting variables 18699 1726882344.06524: in VariableManager get_vars() 18699 1726882344.06563: Calling all_inventory to load vars for managed_node1 18699 1726882344.06566: Calling groups_inventory to load vars for managed_node1 18699 1726882344.06569: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882344.06580: Calling all_plugins_play to load vars for managed_node1 18699 1726882344.06583: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882344.06586: Calling groups_plugins_play to load vars for managed_node1 18699 1726882344.08276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882344.10185: done with get_vars() 18699 1726882344.10241: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:32:24 -0400 (0:00:00.057) 0:00:17.699 ****** 18699 1726882344.10384: entering _queue_task() for managed_node1/ping 18699 1726882344.10385: Creating lock for ping 18699 1726882344.10777: worker is 1 (out of 1 available) 18699 1726882344.10788: exiting _queue_task() for managed_node1/ping 18699 1726882344.10804: done queuing things up, now waiting for results queue to drain 18699 1726882344.10806: waiting for pending results... 
18699 1726882344.11206: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18699 1726882344.11302: in run() - task 12673a56-9f93-1ce6-d207-00000000002e 18699 1726882344.11325: variable 'ansible_search_path' from source: unknown 18699 1726882344.11333: variable 'ansible_search_path' from source: unknown 18699 1726882344.11403: calling self._execute() 18699 1726882344.11509: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882344.11515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882344.11600: variable 'omit' from source: magic vars 18699 1726882344.11962: variable 'ansible_distribution_major_version' from source: facts 18699 1726882344.11977: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882344.11987: variable 'omit' from source: magic vars 18699 1726882344.12044: variable 'omit' from source: magic vars 18699 1726882344.12089: variable 'omit' from source: magic vars 18699 1726882344.12143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882344.12211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882344.12245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882344.12354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882344.12357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882344.12360: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882344.12362: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882344.12366: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 18699 1726882344.12453: Set connection var ansible_connection to ssh 18699 1726882344.12471: Set connection var ansible_pipelining to False 18699 1726882344.12488: Set connection var ansible_shell_executable to /bin/sh 18699 1726882344.12503: Set connection var ansible_timeout to 10 18699 1726882344.12509: Set connection var ansible_shell_type to sh 18699 1726882344.12518: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882344.12548: variable 'ansible_shell_executable' from source: unknown 18699 1726882344.12555: variable 'ansible_connection' from source: unknown 18699 1726882344.12567: variable 'ansible_module_compression' from source: unknown 18699 1726882344.12575: variable 'ansible_shell_type' from source: unknown 18699 1726882344.12582: variable 'ansible_shell_executable' from source: unknown 18699 1726882344.12680: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882344.12683: variable 'ansible_pipelining' from source: unknown 18699 1726882344.12685: variable 'ansible_timeout' from source: unknown 18699 1726882344.12687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882344.12863: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882344.12879: variable 'omit' from source: magic vars 18699 1726882344.12888: starting attempt loop 18699 1726882344.12917: running the handler 18699 1726882344.12934: _low_level_execute_command(): starting 18699 1726882344.13004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882344.14425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882344.14599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882344.14913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882344.15086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882344.16884: stdout chunk (state=3): >>>/root <<< 18699 1726882344.17133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882344.17140: stdout chunk (state=3): >>><<< 18699 1726882344.17150: stderr chunk (state=3): >>><<< 18699 1726882344.17203: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882344.17207: _low_level_execute_command(): starting 18699 1726882344.17420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055 `" && echo ansible-tmp-1726882344.1717038-19564-83954638568055="` echo /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055 `" ) && sleep 0' 18699 1726882344.19501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882344.19505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882344.19826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882344.19999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882344.20106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882344.22513: stdout chunk (state=3): >>>ansible-tmp-1726882344.1717038-19564-83954638568055=/root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055 <<< 18699 1726882344.22517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882344.22522: stdout chunk (state=3): >>><<< 18699 1726882344.22525: stderr chunk (state=3): >>><<< 18699 1726882344.22544: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882344.1717038-19564-83954638568055=/root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882344.22600: variable 'ansible_module_compression' from source: unknown 18699 1726882344.22636: ANSIBALLZ: Using lock for ping 18699 1726882344.22640: ANSIBALLZ: Acquiring lock 18699 1726882344.22642: ANSIBALLZ: Lock acquired: 140254445679872 18699 1726882344.22645: ANSIBALLZ: Creating module 18699 1726882344.46080: ANSIBALLZ: Writing module into payload 18699 1726882344.46131: ANSIBALLZ: Writing module 18699 1726882344.46207: ANSIBALLZ: Renaming module 18699 1726882344.46412: ANSIBALLZ: Done creating module 18699 1726882344.46415: variable 'ansible_facts' from source: unknown 18699 1726882344.46418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py 18699 1726882344.46932: Sending initial data 18699 1726882344.46951: Sent initial data (152 bytes) 18699 1726882344.48061: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882344.48078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882344.48158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882344.48265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882344.48324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882344.49925: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882344.49963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882344.50011: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpwp9co50a /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py <<< 18699 1726882344.50056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpwp9co50a" to remote "/root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py" <<< 18699 1726882344.52664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882344.52671: stdout chunk (state=3): >>><<< 18699 1726882344.52673: stderr chunk (state=3): >>><<< 18699 1726882344.52675: done transferring module to remote 18699 1726882344.52677: _low_level_execute_command(): starting 18699 1726882344.52680: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/ /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py && sleep 0' 18699 1726882344.54184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882344.54306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882344.54321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882344.54408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882344.56236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882344.56315: stderr chunk (state=3): >>><<< 18699 1726882344.56323: stdout chunk (state=3): >>><<< 18699 1726882344.56345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882344.56368: _low_level_execute_command(): starting 18699 1726882344.56376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/AnsiballZ_ping.py && sleep 0' 18699 1726882344.57115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882344.57158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882344.57184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882344.57275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882344.72421: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18699 1726882344.73705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.9.159 closed. <<< 18699 1726882344.73741: stdout chunk (state=3): >>><<< 18699 1726882344.73745: stderr chunk (state=3): >>><<< 18699 1726882344.73769: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882344.73804: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882344.73989: _low_level_execute_command(): starting 18699 1726882344.73997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882344.1717038-19564-83954638568055/ > /dev/null 2>&1 && sleep 0' 18699 1726882344.74636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882344.74649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882344.74669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882344.74687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882344.74711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882344.74723: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882344.74738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882344.74773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882344.74872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882344.74935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882344.74987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882344.76978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882344.76982: stdout chunk (state=3): >>><<< 18699 1726882344.76984: stderr chunk (state=3): >>><<< 18699 1726882344.76987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882344.76989: handler run complete 18699 1726882344.76991: attempt loop complete, returning result 18699 1726882344.77002: _execute() done 18699 1726882344.77004: dumping result to json 18699 1726882344.77010: done dumping result, returning 18699 1726882344.77012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-1ce6-d207-00000000002e] 18699 1726882344.77302: sending task result for task 12673a56-9f93-1ce6-d207-00000000002e 18699 1726882344.77377: done sending task result for task 12673a56-9f93-1ce6-d207-00000000002e 18699 1726882344.77801: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 18699 1726882344.77826: no more pending results, returning what we have 18699 1726882344.77829: results queue empty 18699 1726882344.77830: checking for any_errors_fatal 18699 1726882344.77834: done checking for any_errors_fatal 18699 1726882344.77835: checking for max_fail_percentage 18699 1726882344.77837: done checking for max_fail_percentage 18699 1726882344.77838: checking to see if all hosts have failed and the running result is not ok 18699 1726882344.77838: done checking to see if all hosts have failed 18699 1726882344.77839: getting the remaining hosts for this loop 18699 1726882344.77841: done getting the remaining hosts for this loop 18699 1726882344.77844: getting the next task for host managed_node1 18699 1726882344.77851: done getting next task for host managed_node1 18699 1726882344.77853: ^ task is: TASK: meta (role_complete) 18699 1726882344.77854: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18699 1726882344.77862: getting variables 18699 1726882344.77864: in VariableManager get_vars() 18699 1726882344.77902: Calling all_inventory to load vars for managed_node1 18699 1726882344.77905: Calling groups_inventory to load vars for managed_node1 18699 1726882344.77908: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882344.77918: Calling all_plugins_play to load vars for managed_node1 18699 1726882344.77920: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882344.77922: Calling groups_plugins_play to load vars for managed_node1 18699 1726882344.81814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882344.86070: done with get_vars() 18699 1726882344.86328: done getting variables 18699 1726882344.86511: done queuing things up, now waiting for results queue to drain 18699 1726882344.86513: results queue empty 18699 1726882344.86514: checking for any_errors_fatal 18699 1726882344.86517: done checking for any_errors_fatal 18699 1726882344.86518: checking for max_fail_percentage 18699 1726882344.86519: done checking for max_fail_percentage 18699 1726882344.86519: checking to see if all hosts have failed and the running result is not ok 18699 1726882344.86520: done checking to see if all hosts have failed 18699 1726882344.86521: getting the remaining hosts for this loop 18699 1726882344.86522: done getting the remaining hosts for this loop 18699 1726882344.86525: getting the next task for host managed_node1 18699 1726882344.86529: done getting next task for host managed_node1 18699 1726882344.86532: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18699 1726882344.86597: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882344.86601: getting variables 18699 1726882344.86602: in VariableManager get_vars() 18699 1726882344.86617: Calling all_inventory to load vars for managed_node1 18699 1726882344.86619: Calling groups_inventory to load vars for managed_node1 18699 1726882344.86621: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882344.86627: Calling all_plugins_play to load vars for managed_node1 18699 1726882344.86629: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882344.86632: Calling groups_plugins_play to load vars for managed_node1 18699 1726882344.88186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882344.91522: done with get_vars() 18699 1726882344.91548: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 21:32:24 -0400 (0:00:00.814) 0:00:18.514 ****** 18699 1726882344.91850: entering _queue_task() for managed_node1/include_tasks 18699 1726882344.92847: worker is 1 (out of 1 available) 18699 1726882344.92861: exiting _queue_task() for managed_node1/include_tasks 18699 1726882344.92873: done queuing things up, now waiting for results queue to drain 18699 1726882344.92987: waiting for pending results... 
18699 1726882344.93373: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18699 1726882344.93759: in run() - task 12673a56-9f93-1ce6-d207-000000000030 18699 1726882344.94202: variable 'ansible_search_path' from source: unknown 18699 1726882344.94207: calling self._execute() 18699 1726882344.94255: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882344.94309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882344.94616: variable 'omit' from source: magic vars 18699 1726882344.95292: variable 'ansible_distribution_major_version' from source: facts 18699 1726882344.95315: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882344.95327: _execute() done 18699 1726882344.95336: dumping result to json 18699 1726882344.95344: done dumping result, returning 18699 1726882344.95354: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [12673a56-9f93-1ce6-d207-000000000030] 18699 1726882344.95362: sending task result for task 12673a56-9f93-1ce6-d207-000000000030 18699 1726882344.95492: no more pending results, returning what we have 18699 1726882344.95502: in VariableManager get_vars() 18699 1726882344.95618: Calling all_inventory to load vars for managed_node1 18699 1726882344.95621: Calling groups_inventory to load vars for managed_node1 18699 1726882344.95624: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882344.95636: Calling all_plugins_play to load vars for managed_node1 18699 1726882344.95639: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882344.95641: Calling groups_plugins_play to load vars for managed_node1 18699 1726882344.96273: done sending task result for task 12673a56-9f93-1ce6-d207-000000000030 18699 1726882344.96276: WORKER PROCESS EXITING 18699 1726882344.99966: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.02066: done with get_vars() 18699 1726882345.02086: variable 'ansible_search_path' from source: unknown 18699 1726882345.02106: we have included files to process 18699 1726882345.02107: generating all_blocks data 18699 1726882345.02109: done generating all_blocks data 18699 1726882345.02113: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18699 1726882345.02114: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18699 1726882345.02117: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18699 1726882345.02761: done processing included file 18699 1726882345.02763: iterating over new_blocks loaded from include file 18699 1726882345.02765: in VariableManager get_vars() 18699 1726882345.02792: done with get_vars() 18699 1726882345.02798: filtering new block on tags 18699 1726882345.02816: done filtering new block on tags 18699 1726882345.02818: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node1 18699 1726882345.02824: extending task lists for all hosts with included blocks 18699 1726882345.02856: done extending task lists 18699 1726882345.02857: done processing included files 18699 1726882345.02858: results queue empty 18699 1726882345.02859: checking for any_errors_fatal 18699 1726882345.02860: done checking for any_errors_fatal 18699 1726882345.02861: checking for max_fail_percentage 18699 1726882345.02862: done checking for 
max_fail_percentage 18699 1726882345.02863: checking to see if all hosts have failed and the running result is not ok 18699 1726882345.02864: done checking to see if all hosts have failed 18699 1726882345.02865: getting the remaining hosts for this loop 18699 1726882345.02866: done getting the remaining hosts for this loop 18699 1726882345.02868: getting the next task for host managed_node1 18699 1726882345.02872: done getting next task for host managed_node1 18699 1726882345.02874: ^ task is: TASK: Assert that warnings is empty 18699 1726882345.02877: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882345.02879: getting variables 18699 1726882345.02880: in VariableManager get_vars() 18699 1726882345.02917: Calling all_inventory to load vars for managed_node1 18699 1726882345.02919: Calling groups_inventory to load vars for managed_node1 18699 1726882345.02922: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.02927: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.02930: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882345.02933: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.05182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.06820: done with get_vars() 18699 1726882345.06880: done getting variables 18699 1726882345.06930: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 21:32:25 -0400 (0:00:00.151) 0:00:18.666 ****** 18699 1726882345.07024: entering _queue_task() for managed_node1/assert 18699 1726882345.07811: worker is 1 (out of 1 available) 18699 1726882345.07829: exiting _queue_task() for managed_node1/assert 18699 1726882345.07842: done queuing things up, now waiting for results queue to drain 18699 1726882345.07843: waiting for pending results... 
18699 1726882345.08236: running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty 18699 1726882345.08502: in run() - task 12673a56-9f93-1ce6-d207-000000000304 18699 1726882345.08507: variable 'ansible_search_path' from source: unknown 18699 1726882345.08509: variable 'ansible_search_path' from source: unknown 18699 1726882345.08614: calling self._execute() 18699 1726882345.08800: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.08813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.08835: variable 'omit' from source: magic vars 18699 1726882345.09576: variable 'ansible_distribution_major_version' from source: facts 18699 1726882345.09843: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882345.09847: variable 'omit' from source: magic vars 18699 1726882345.09849: variable 'omit' from source: magic vars 18699 1726882345.09852: variable 'omit' from source: magic vars 18699 1726882345.09889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882345.09938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882345.10000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882345.10003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882345.10015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882345.10055: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882345.10068: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.10075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 
1726882345.10187: Set connection var ansible_connection to ssh 18699 1726882345.10203: Set connection var ansible_pipelining to False 18699 1726882345.10214: Set connection var ansible_shell_executable to /bin/sh 18699 1726882345.10224: Set connection var ansible_timeout to 10 18699 1726882345.10236: Set connection var ansible_shell_type to sh 18699 1726882345.10246: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882345.10343: variable 'ansible_shell_executable' from source: unknown 18699 1726882345.10346: variable 'ansible_connection' from source: unknown 18699 1726882345.10349: variable 'ansible_module_compression' from source: unknown 18699 1726882345.10351: variable 'ansible_shell_type' from source: unknown 18699 1726882345.10353: variable 'ansible_shell_executable' from source: unknown 18699 1726882345.10354: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.10356: variable 'ansible_pipelining' from source: unknown 18699 1726882345.10358: variable 'ansible_timeout' from source: unknown 18699 1726882345.10360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.10552: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882345.10596: variable 'omit' from source: magic vars 18699 1726882345.10599: starting attempt loop 18699 1726882345.10601: running the handler 18699 1726882345.10730: variable '__network_connections_result' from source: set_fact 18699 1726882345.10748: Evaluated conditional ('warnings' not in __network_connections_result): True 18699 1726882345.10758: handler run complete 18699 1726882345.10837: attempt loop complete, returning result 18699 1726882345.10840: _execute() done 18699 
1726882345.10842: dumping result to json 18699 1726882345.10844: done dumping result, returning 18699 1726882345.10846: done running TaskExecutor() for managed_node1/TASK: Assert that warnings is empty [12673a56-9f93-1ce6-d207-000000000304] 18699 1726882345.10849: sending task result for task 12673a56-9f93-1ce6-d207-000000000304 18699 1726882345.11048: done sending task result for task 12673a56-9f93-1ce6-d207-000000000304 18699 1726882345.11052: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18699 1726882345.11105: no more pending results, returning what we have 18699 1726882345.11109: results queue empty 18699 1726882345.11110: checking for any_errors_fatal 18699 1726882345.11112: done checking for any_errors_fatal 18699 1726882345.11112: checking for max_fail_percentage 18699 1726882345.11114: done checking for max_fail_percentage 18699 1726882345.11115: checking to see if all hosts have failed and the running result is not ok 18699 1726882345.11116: done checking to see if all hosts have failed 18699 1726882345.11116: getting the remaining hosts for this loop 18699 1726882345.11119: done getting the remaining hosts for this loop 18699 1726882345.11122: getting the next task for host managed_node1 18699 1726882345.11128: done getting next task for host managed_node1 18699 1726882345.11130: ^ task is: TASK: Assert that there is output in stderr 18699 1726882345.11133: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18699 1726882345.11137: getting variables 18699 1726882345.11139: in VariableManager get_vars() 18699 1726882345.11175: Calling all_inventory to load vars for managed_node1 18699 1726882345.11178: Calling groups_inventory to load vars for managed_node1 18699 1726882345.11180: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.11190: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.11196: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882345.11200: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.24155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.28757: done with get_vars() 18699 1726882345.29011: done getting variables 18699 1726882345.29063: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 21:32:25 -0400 (0:00:00.220) 0:00:18.886 ****** 18699 1726882345.29090: entering _queue_task() for managed_node1/assert 18699 1726882345.30302: worker is 1 (out of 1 available) 18699 1726882345.30316: exiting _queue_task() for managed_node1/assert 18699 1726882345.30326: done queuing things up, now waiting for results queue to drain 18699 1726882345.30327: waiting for pending results... 
18699 1726882345.30697: running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr 18699 1726882345.31004: in run() - task 12673a56-9f93-1ce6-d207-000000000305 18699 1726882345.31008: variable 'ansible_search_path' from source: unknown 18699 1726882345.31011: variable 'ansible_search_path' from source: unknown 18699 1726882345.31014: calling self._execute() 18699 1726882345.31351: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.31355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.31358: variable 'omit' from source: magic vars 18699 1726882345.32602: variable 'ansible_distribution_major_version' from source: facts 18699 1726882345.32606: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882345.32609: variable 'omit' from source: magic vars 18699 1726882345.32612: variable 'omit' from source: magic vars 18699 1726882345.32760: variable 'omit' from source: magic vars 18699 1726882345.32854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882345.33063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882345.33070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882345.33098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882345.33280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882345.33284: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882345.33287: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.33290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 
1726882345.33574: Set connection var ansible_connection to ssh 18699 1726882345.33932: Set connection var ansible_pipelining to False 18699 1726882345.33936: Set connection var ansible_shell_executable to /bin/sh 18699 1726882345.33938: Set connection var ansible_timeout to 10 18699 1726882345.33941: Set connection var ansible_shell_type to sh 18699 1726882345.33943: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882345.33946: variable 'ansible_shell_executable' from source: unknown 18699 1726882345.33948: variable 'ansible_connection' from source: unknown 18699 1726882345.33950: variable 'ansible_module_compression' from source: unknown 18699 1726882345.33952: variable 'ansible_shell_type' from source: unknown 18699 1726882345.33954: variable 'ansible_shell_executable' from source: unknown 18699 1726882345.33956: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.33958: variable 'ansible_pipelining' from source: unknown 18699 1726882345.33959: variable 'ansible_timeout' from source: unknown 18699 1726882345.33961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.34400: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882345.34417: variable 'omit' from source: magic vars 18699 1726882345.34484: starting attempt loop 18699 1726882345.34492: running the handler 18699 1726882345.34853: variable '__network_connections_result' from source: set_fact 18699 1726882345.35130: Evaluated conditional ('stderr' in __network_connections_result): True 18699 1726882345.35133: handler run complete 18699 1726882345.35136: attempt loop complete, returning result 18699 1726882345.35347: _execute() done 18699 
1726882345.35351: dumping result to json 18699 1726882345.35353: done dumping result, returning 18699 1726882345.35355: done running TaskExecutor() for managed_node1/TASK: Assert that there is output in stderr [12673a56-9f93-1ce6-d207-000000000305] 18699 1726882345.35357: sending task result for task 12673a56-9f93-1ce6-d207-000000000305 18699 1726882345.35427: done sending task result for task 12673a56-9f93-1ce6-d207-000000000305 18699 1726882345.35431: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18699 1726882345.35482: no more pending results, returning what we have 18699 1726882345.35486: results queue empty 18699 1726882345.35487: checking for any_errors_fatal 18699 1726882345.35501: done checking for any_errors_fatal 18699 1726882345.35502: checking for max_fail_percentage 18699 1726882345.35505: done checking for max_fail_percentage 18699 1726882345.35506: checking to see if all hosts have failed and the running result is not ok 18699 1726882345.35507: done checking to see if all hosts have failed 18699 1726882345.35508: getting the remaining hosts for this loop 18699 1726882345.35509: done getting the remaining hosts for this loop 18699 1726882345.35513: getting the next task for host managed_node1 18699 1726882345.35523: done getting next task for host managed_node1 18699 1726882345.35526: ^ task is: TASK: meta (flush_handlers) 18699 1726882345.35528: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882345.35533: getting variables 18699 1726882345.35535: in VariableManager get_vars() 18699 1726882345.35575: Calling all_inventory to load vars for managed_node1 18699 1726882345.35578: Calling groups_inventory to load vars for managed_node1 18699 1726882345.35580: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.35591: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.35799: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882345.35806: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.39203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.41009: done with get_vars() 18699 1726882345.41033: done getting variables 18699 1726882345.41108: in VariableManager get_vars() 18699 1726882345.41121: Calling all_inventory to load vars for managed_node1 18699 1726882345.41123: Calling groups_inventory to load vars for managed_node1 18699 1726882345.41125: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.41130: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.41132: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882345.41134: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.43959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.47571: done with get_vars() 18699 1726882345.47613: done queuing things up, now waiting for results queue to drain 18699 1726882345.47616: results queue empty 18699 1726882345.47617: checking for any_errors_fatal 18699 1726882345.47619: done checking for any_errors_fatal 18699 1726882345.47620: checking for max_fail_percentage 18699 1726882345.47622: done checking for max_fail_percentage 18699 1726882345.47623: checking to see if all hosts have failed and the running result is not 
ok 18699 1726882345.47624: done checking to see if all hosts have failed 18699 1726882345.47624: getting the remaining hosts for this loop 18699 1726882345.47631: done getting the remaining hosts for this loop 18699 1726882345.47634: getting the next task for host managed_node1 18699 1726882345.47638: done getting next task for host managed_node1 18699 1726882345.47640: ^ task is: TASK: meta (flush_handlers) 18699 1726882345.47641: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882345.47644: getting variables 18699 1726882345.47645: in VariableManager get_vars() 18699 1726882345.47702: Calling all_inventory to load vars for managed_node1 18699 1726882345.47705: Calling groups_inventory to load vars for managed_node1 18699 1726882345.47707: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.47713: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.47715: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882345.47717: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.50536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.54542: done with get_vars() 18699 1726882345.54563: done getting variables 18699 1726882345.54758: in VariableManager get_vars() 18699 1726882345.54772: Calling all_inventory to load vars for managed_node1 18699 1726882345.54774: Calling groups_inventory to load vars for managed_node1 18699 1726882345.54777: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.54782: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.54784: Calling groups_plugins_inventory to load vars for 
managed_node1 18699 1726882345.54787: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.56004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.58809: done with get_vars() 18699 1726882345.58939: done queuing things up, now waiting for results queue to drain 18699 1726882345.58942: results queue empty 18699 1726882345.58945: checking for any_errors_fatal 18699 1726882345.58946: done checking for any_errors_fatal 18699 1726882345.58947: checking for max_fail_percentage 18699 1726882345.58948: done checking for max_fail_percentage 18699 1726882345.58949: checking to see if all hosts have failed and the running result is not ok 18699 1726882345.58950: done checking to see if all hosts have failed 18699 1726882345.58951: getting the remaining hosts for this loop 18699 1726882345.58952: done getting the remaining hosts for this loop 18699 1726882345.58955: getting the next task for host managed_node1 18699 1726882345.58959: done getting next task for host managed_node1 18699 1726882345.58960: ^ task is: None 18699 1726882345.58961: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882345.58962: done queuing things up, now waiting for results queue to drain 18699 1726882345.58964: results queue empty 18699 1726882345.58964: checking for any_errors_fatal 18699 1726882345.58967: done checking for any_errors_fatal 18699 1726882345.59010: checking for max_fail_percentage 18699 1726882345.59011: done checking for max_fail_percentage 18699 1726882345.59012: checking to see if all hosts have failed and the running result is not ok 18699 1726882345.59013: done checking to see if all hosts have failed 18699 1726882345.59014: getting the next task for host managed_node1 18699 1726882345.59017: done getting next task for host managed_node1 18699 1726882345.59022: ^ task is: None 18699 1726882345.59024: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882345.59130: in VariableManager get_vars() 18699 1726882345.59148: done with get_vars() 18699 1726882345.59154: in VariableManager get_vars() 18699 1726882345.59165: done with get_vars() 18699 1726882345.59174: variable 'omit' from source: magic vars 18699 1726882345.59430: in VariableManager get_vars() 18699 1726882345.59442: done with get_vars() 18699 1726882345.59465: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18699 1726882345.60272: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882345.60502: getting the remaining hosts for this loop 18699 1726882345.60504: done getting the remaining hosts for this loop 18699 1726882345.60507: getting the next task for host managed_node1 18699 1726882345.60510: done getting next task for host managed_node1 18699 1726882345.60512: ^ task is: TASK: Gathering Facts 18699 1726882345.60513: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882345.60515: getting variables 18699 1726882345.60516: in VariableManager get_vars() 18699 1726882345.60526: Calling all_inventory to load vars for managed_node1 18699 1726882345.60528: Calling groups_inventory to load vars for managed_node1 18699 1726882345.60531: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882345.60537: Calling all_plugins_play to load vars for managed_node1 18699 1726882345.60540: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882345.60543: Calling groups_plugins_play to load vars for managed_node1 18699 1726882345.63073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882345.64636: done with get_vars() 18699 1726882345.64661: done getting variables 18699 1726882345.64714: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 21:32:25 -0400 (0:00:00.356) 0:00:19.243 ****** 18699 1726882345.64742: entering _queue_task() for managed_node1/gather_facts 18699 1726882345.65074: worker is 1 (out of 1 available) 18699 1726882345.65085: exiting _queue_task() for managed_node1/gather_facts 18699 1726882345.65099: done queuing things up, now waiting for results queue to drain 18699 1726882345.65100: waiting for pending results... 
18699 1726882345.65361: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882345.65702: in run() - task 12673a56-9f93-1ce6-d207-000000000316 18699 1726882345.65707: variable 'ansible_search_path' from source: unknown 18699 1726882345.65711: calling self._execute() 18699 1726882345.65714: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.65717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.65719: variable 'omit' from source: magic vars 18699 1726882345.66202: variable 'ansible_distribution_major_version' from source: facts 18699 1726882345.66219: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882345.66228: variable 'omit' from source: magic vars 18699 1726882345.66254: variable 'omit' from source: magic vars 18699 1726882345.66310: variable 'omit' from source: magic vars 18699 1726882345.66356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882345.66414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882345.66442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882345.66464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882345.66497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882345.66539: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882345.66551: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.66559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.66680: Set connection var ansible_connection to ssh 18699 1726882345.66707: Set 
connection var ansible_pipelining to False 18699 1726882345.66710: Set connection var ansible_shell_executable to /bin/sh 18699 1726882345.66798: Set connection var ansible_timeout to 10 18699 1726882345.66801: Set connection var ansible_shell_type to sh 18699 1726882345.66803: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882345.66805: variable 'ansible_shell_executable' from source: unknown 18699 1726882345.66807: variable 'ansible_connection' from source: unknown 18699 1726882345.66811: variable 'ansible_module_compression' from source: unknown 18699 1726882345.66813: variable 'ansible_shell_type' from source: unknown 18699 1726882345.66815: variable 'ansible_shell_executable' from source: unknown 18699 1726882345.66819: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882345.66822: variable 'ansible_pipelining' from source: unknown 18699 1726882345.66824: variable 'ansible_timeout' from source: unknown 18699 1726882345.66826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882345.67158: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882345.67164: variable 'omit' from source: magic vars 18699 1726882345.67169: starting attempt loop 18699 1726882345.67171: running the handler 18699 1726882345.67173: variable 'ansible_facts' from source: unknown 18699 1726882345.67177: _low_level_execute_command(): starting 18699 1726882345.67198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882345.68766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882345.68910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882345.68945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882345.70654: stdout chunk (state=3): >>>/root <<< 18699 1726882345.70790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882345.70823: stdout chunk (state=3): >>><<< 18699 1726882345.70826: stderr chunk (state=3): >>><<< 18699 1726882345.70849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882345.70950: _low_level_execute_command(): starting 18699 1726882345.70954: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259 `" && echo ansible-tmp-1726882345.7085614-19645-170801521239259="` echo /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259 `" ) && sleep 0' 18699 1726882345.71485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882345.71510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882345.71525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882345.71543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882345.71562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882345.71575: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882345.71590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882345.72002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882345.72019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882345.72102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882345.74005: stdout chunk (state=3): >>>ansible-tmp-1726882345.7085614-19645-170801521239259=/root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259 <<< 18699 1726882345.74138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882345.74529: stdout chunk (state=3): >>><<< 18699 1726882345.74535: stderr chunk (state=3): >>><<< 18699 1726882345.74538: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882345.7085614-19645-170801521239259=/root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882345.74541: variable 'ansible_module_compression' from source: unknown 18699 1726882345.74543: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882345.74546: variable 'ansible_facts' from source: unknown 18699 1726882345.74950: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py 18699 1726882345.75421: Sending initial data 18699 1726882345.75424: Sent initial data (154 bytes) 18699 1726882345.76316: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882345.76325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882345.76336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882345.76352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882345.76365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882345.76368: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882345.76383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882345.76403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882345.76412: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882345.76419: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882345.76431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882345.76500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882345.76520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882345.76533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882345.76541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882345.76713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882345.78294: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882345.78398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882345.78541: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpex83yfga /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py <<< 18699 1726882345.78545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py" <<< 18699 1726882345.78582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpex83yfga" to remote "/root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py" <<< 18699 1726882345.81382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882345.81392: stdout chunk (state=3): >>><<< 18699 1726882345.81604: stderr chunk (state=3): >>><<< 18699 1726882345.81607: done transferring module to remote 18699 1726882345.81609: _low_level_execute_command(): starting 18699 1726882345.81612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/ /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py && sleep 0' 18699 1726882345.82760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882345.82913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882345.83253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882345.83270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882345.83344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882345.85192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882345.85217: stdout chunk (state=3): >>><<< 18699 1726882345.85232: stderr chunk (state=3): >>><<< 18699 1726882345.85506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882345.85509: _low_level_execute_command(): starting 18699 1726882345.85512: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/AnsiballZ_setup.py && sleep 0' 18699 1726882345.86624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882345.86685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882345.86841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882345.86861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882345.87020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882346.52537: stdout 
chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.45654296875, "5m": 0.32373046875, "15m": 0.15966796875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "26", "epoch": "1726882346", "epoch_int": "1726882346", "date": "2024-09-20", "time": "21:32:26", "iso8601_micro": "2024-09-21T01:32:26.142923Z", "iso8601": "2024-09-21T01:32:26Z", "iso8601_basic": "20240920T213226142923", "iso8601_basic_short": "20240920T213226", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.16<<< 18699 1726882346.52602: stdout chunk (state=3): >>>9.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2962, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 569, "free": 2962}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 779, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794799616, "block_size": 4096, "block_total": 65519099, "block_available": 63914746, "block_used": 1604353, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": 
true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lsr27", "eth0", "lo", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "<<< 18699 1726882346.52720: stdout chunk (state=3): >>>on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": 
"lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee", "fe80::c05a:92ff:fe97:6608"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882346.54600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882346.54611: stdout chunk (state=3): >>><<< 18699 1726882346.54614: stderr chunk (state=3): >>><<< 18699 1726882346.54672: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.45654296875, "5m": 0.32373046875, "15m": 0.15966796875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "26", "epoch": "1726882346", "epoch_int": "1726882346", "date": "2024-09-20", "time": "21:32:26", 
"iso8601_micro": "2024-09-21T01:32:26.142923Z", "iso8601": "2024-09-21T01:32:26Z", "iso8601_basic": "20240920T213226142923", "iso8601_basic_short": "20240920T213226", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2962, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 569, "free": 2962}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": 
"Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 779, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794799616, "block_size": 4096, "block_total": 65519099, "block_available": 63914746, "block_used": 1604353, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lsr27", "eth0", "lo", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", 
"rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": 
"off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee", "fe80::c05a:92ff:fe97:6608"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882346.55265: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882346.55268: _low_level_execute_command(): starting 18699 1726882346.55271: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882345.7085614-19645-170801521239259/ > /dev/null 2>&1 && sleep 0' 18699 1726882346.56228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882346.56258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882346.56332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882346.56374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882346.56444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882346.58283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882346.58298: stdout chunk (state=3): >>><<< 18699 1726882346.58313: stderr chunk (state=3): >>><<< 18699 1726882346.58499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882346.58503: handler run complete 18699 1726882346.58509: variable 'ansible_facts' from source: unknown 18699 1726882346.58626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.59057: variable 'ansible_facts' from source: unknown 18699 1726882346.59200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.59368: attempt loop complete, returning result 18699 1726882346.59470: _execute() done 18699 1726882346.59478: dumping result to json 18699 1726882346.59521: done dumping result, returning 18699 1726882346.59534: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-000000000316] 18699 1726882346.59542: sending task result for task 12673a56-9f93-1ce6-d207-000000000316 18699 1726882346.60182: done sending task result for task 12673a56-9f93-1ce6-d207-000000000316 18699 1726882346.60185: WORKER PROCESS EXITING ok: [managed_node1] 18699 1726882346.60847: no more pending results, returning what we have 18699 1726882346.60850: results queue empty 18699 1726882346.60852: checking for any_errors_fatal 18699 1726882346.60853: done checking for any_errors_fatal 18699 1726882346.60854: checking for max_fail_percentage 18699 1726882346.60856: done checking for 
max_fail_percentage 18699 1726882346.60857: checking to see if all hosts have failed and the running result is not ok 18699 1726882346.60858: done checking to see if all hosts have failed 18699 1726882346.60858: getting the remaining hosts for this loop 18699 1726882346.60860: done getting the remaining hosts for this loop 18699 1726882346.60863: getting the next task for host managed_node1 18699 1726882346.60868: done getting next task for host managed_node1 18699 1726882346.60870: ^ task is: TASK: meta (flush_handlers) 18699 1726882346.60872: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882346.60876: getting variables 18699 1726882346.60878: in VariableManager get_vars() 18699 1726882346.60929: Calling all_inventory to load vars for managed_node1 18699 1726882346.60958: Calling groups_inventory to load vars for managed_node1 18699 1726882346.60962: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.61004: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.61007: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.61011: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.63352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.67091: done with get_vars() 18699 1726882346.67191: done getting variables 18699 1726882346.67307: in VariableManager get_vars() 18699 1726882346.67320: Calling all_inventory to load vars for managed_node1 18699 1726882346.67322: Calling groups_inventory to load vars for managed_node1 18699 1726882346.67325: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.67330: Calling 
all_plugins_play to load vars for managed_node1 18699 1726882346.67333: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.67342: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.69079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.70732: done with get_vars() 18699 1726882346.70763: done queuing things up, now waiting for results queue to drain 18699 1726882346.70765: results queue empty 18699 1726882346.70766: checking for any_errors_fatal 18699 1726882346.70770: done checking for any_errors_fatal 18699 1726882346.70771: checking for max_fail_percentage 18699 1726882346.70772: done checking for max_fail_percentage 18699 1726882346.70772: checking to see if all hosts have failed and the running result is not ok 18699 1726882346.70773: done checking to see if all hosts have failed 18699 1726882346.70779: getting the remaining hosts for this loop 18699 1726882346.70780: done getting the remaining hosts for this loop 18699 1726882346.70783: getting the next task for host managed_node1 18699 1726882346.70787: done getting next task for host managed_node1 18699 1726882346.70789: ^ task is: TASK: Show network_provider 18699 1726882346.70791: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882346.70797: getting variables 18699 1726882346.70799: in VariableManager get_vars() 18699 1726882346.70808: Calling all_inventory to load vars for managed_node1 18699 1726882346.70811: Calling groups_inventory to load vars for managed_node1 18699 1726882346.70813: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.70823: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.70827: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.70830: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.72303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.74907: done with get_vars() 18699 1726882346.74992: done getting variables 18699 1726882346.75205: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 21:32:26 -0400 (0:00:01.104) 0:00:20.348 ****** 18699 1726882346.75235: entering _queue_task() for managed_node1/debug 18699 1726882346.75838: worker is 1 (out of 1 available) 18699 1726882346.75849: exiting _queue_task() for managed_node1/debug 18699 1726882346.75861: done queuing things up, now waiting for results queue to drain 18699 1726882346.75861: waiting for pending results... 
18699 1726882346.76231: running TaskExecutor() for managed_node1/TASK: Show network_provider 18699 1726882346.76428: in run() - task 12673a56-9f93-1ce6-d207-000000000033 18699 1726882346.76453: variable 'ansible_search_path' from source: unknown 18699 1726882346.76500: calling self._execute() 18699 1726882346.76787: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882346.76792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882346.76800: variable 'omit' from source: magic vars 18699 1726882346.77570: variable 'ansible_distribution_major_version' from source: facts 18699 1726882346.77588: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882346.77604: variable 'omit' from source: magic vars 18699 1726882346.77641: variable 'omit' from source: magic vars 18699 1726882346.77680: variable 'omit' from source: magic vars 18699 1726882346.77726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882346.77771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882346.77802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882346.77826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882346.77856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882346.77885: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882346.77965: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882346.77969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882346.78015: Set connection var ansible_connection to ssh 18699 1726882346.78031: Set 
connection var ansible_pipelining to False 18699 1726882346.78042: Set connection var ansible_shell_executable to /bin/sh 18699 1726882346.78052: Set connection var ansible_timeout to 10 18699 1726882346.78058: Set connection var ansible_shell_type to sh 18699 1726882346.78068: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882346.78112: variable 'ansible_shell_executable' from source: unknown 18699 1726882346.78185: variable 'ansible_connection' from source: unknown 18699 1726882346.78189: variable 'ansible_module_compression' from source: unknown 18699 1726882346.78192: variable 'ansible_shell_type' from source: unknown 18699 1726882346.78199: variable 'ansible_shell_executable' from source: unknown 18699 1726882346.78201: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882346.78203: variable 'ansible_pipelining' from source: unknown 18699 1726882346.78204: variable 'ansible_timeout' from source: unknown 18699 1726882346.78206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882346.78321: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882346.78387: variable 'omit' from source: magic vars 18699 1726882346.78390: starting attempt loop 18699 1726882346.78397: running the handler 18699 1726882346.78440: variable 'network_provider' from source: set_fact 18699 1726882346.78528: variable 'network_provider' from source: set_fact 18699 1726882346.78619: handler run complete 18699 1726882346.78622: attempt loop complete, returning result 18699 1726882346.78625: _execute() done 18699 1726882346.78628: dumping result to json 18699 1726882346.78630: done dumping result, returning 18699 1726882346.78632: done running 
TaskExecutor() for managed_node1/TASK: Show network_provider [12673a56-9f93-1ce6-d207-000000000033] 18699 1726882346.78635: sending task result for task 12673a56-9f93-1ce6-d207-000000000033 18699 1726882346.78711: done sending task result for task 12673a56-9f93-1ce6-d207-000000000033 18699 1726882346.78713: WORKER PROCESS EXITING ok: [managed_node1] => { "network_provider": "nm" } 18699 1726882346.78759: no more pending results, returning what we have 18699 1726882346.78763: results queue empty 18699 1726882346.78764: checking for any_errors_fatal 18699 1726882346.78766: done checking for any_errors_fatal 18699 1726882346.78766: checking for max_fail_percentage 18699 1726882346.78769: done checking for max_fail_percentage 18699 1726882346.78770: checking to see if all hosts have failed and the running result is not ok 18699 1726882346.78771: done checking to see if all hosts have failed 18699 1726882346.78771: getting the remaining hosts for this loop 18699 1726882346.78773: done getting the remaining hosts for this loop 18699 1726882346.78777: getting the next task for host managed_node1 18699 1726882346.78784: done getting next task for host managed_node1 18699 1726882346.78786: ^ task is: TASK: meta (flush_handlers) 18699 1726882346.78787: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882346.78792: getting variables 18699 1726882346.78797: in VariableManager get_vars() 18699 1726882346.78827: Calling all_inventory to load vars for managed_node1 18699 1726882346.78830: Calling groups_inventory to load vars for managed_node1 18699 1726882346.78833: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.78845: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.78848: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.78852: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.80882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.82842: done with get_vars() 18699 1726882346.82875: done getting variables 18699 1726882346.83097: in VariableManager get_vars() 18699 1726882346.83108: Calling all_inventory to load vars for managed_node1 18699 1726882346.83110: Calling groups_inventory to load vars for managed_node1 18699 1726882346.83112: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.83117: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.83119: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.83122: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.84877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.86974: done with get_vars() 18699 1726882346.87015: done queuing things up, now waiting for results queue to drain 18699 1726882346.87017: results queue empty 18699 1726882346.87018: checking for any_errors_fatal 18699 1726882346.87021: done checking for any_errors_fatal 18699 1726882346.87022: checking for max_fail_percentage 18699 1726882346.87023: done checking for max_fail_percentage 18699 1726882346.87023: checking to see if all hosts have failed and the running result is not 
ok 18699 1726882346.87024: done checking to see if all hosts have failed 18699 1726882346.87025: getting the remaining hosts for this loop 18699 1726882346.87026: done getting the remaining hosts for this loop 18699 1726882346.87028: getting the next task for host managed_node1 18699 1726882346.87037: done getting next task for host managed_node1 18699 1726882346.87039: ^ task is: TASK: meta (flush_handlers) 18699 1726882346.87040: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882346.87043: getting variables 18699 1726882346.87043: in VariableManager get_vars() 18699 1726882346.87052: Calling all_inventory to load vars for managed_node1 18699 1726882346.87054: Calling groups_inventory to load vars for managed_node1 18699 1726882346.87056: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.87061: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.87063: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.87066: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.88282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.90519: done with get_vars() 18699 1726882346.90542: done getting variables 18699 1726882346.90610: in VariableManager get_vars() 18699 1726882346.90621: Calling all_inventory to load vars for managed_node1 18699 1726882346.90624: Calling groups_inventory to load vars for managed_node1 18699 1726882346.90708: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.90715: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.90717: Calling groups_plugins_inventory to load vars for 
managed_node1 18699 1726882346.90720: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.92623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882346.94885: done with get_vars() 18699 1726882346.94927: done queuing things up, now waiting for results queue to drain 18699 1726882346.94930: results queue empty 18699 1726882346.94931: checking for any_errors_fatal 18699 1726882346.94932: done checking for any_errors_fatal 18699 1726882346.94933: checking for max_fail_percentage 18699 1726882346.94957: done checking for max_fail_percentage 18699 1726882346.94958: checking to see if all hosts have failed and the running result is not ok 18699 1726882346.94959: done checking to see if all hosts have failed 18699 1726882346.94960: getting the remaining hosts for this loop 18699 1726882346.94961: done getting the remaining hosts for this loop 18699 1726882346.94964: getting the next task for host managed_node1 18699 1726882346.94967: done getting next task for host managed_node1 18699 1726882346.94968: ^ task is: None 18699 1726882346.94970: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882346.94971: done queuing things up, now waiting for results queue to drain 18699 1726882346.94972: results queue empty 18699 1726882346.94972: checking for any_errors_fatal 18699 1726882346.94973: done checking for any_errors_fatal 18699 1726882346.94974: checking for max_fail_percentage 18699 1726882346.94975: done checking for max_fail_percentage 18699 1726882346.94975: checking to see if all hosts have failed and the running result is not ok 18699 1726882346.94976: done checking to see if all hosts have failed 18699 1726882346.94977: getting the next task for host managed_node1 18699 1726882346.94980: done getting next task for host managed_node1 18699 1726882346.94980: ^ task is: None 18699 1726882346.94981: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882346.95019: in VariableManager get_vars() 18699 1726882346.95051: done with get_vars() 18699 1726882346.95066: in VariableManager get_vars() 18699 1726882346.95079: done with get_vars() 18699 1726882346.95084: variable 'omit' from source: magic vars 18699 1726882346.95271: variable 'profile' from source: play vars 18699 1726882346.95397: in VariableManager get_vars() 18699 1726882346.95411: done with get_vars() 18699 1726882346.95432: variable 'omit' from source: magic vars 18699 1726882346.95504: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18699 1726882346.96620: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882346.96644: getting the remaining hosts for this loop 18699 1726882346.96646: done getting the remaining hosts for this loop 18699 1726882346.96648: getting the next task for host managed_node1 18699 1726882346.96675: done getting next task for host managed_node1 18699 1726882346.96677: ^ task is: TASK: Gathering Facts 18699 1726882346.96679: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882346.96681: getting variables 18699 1726882346.96683: in VariableManager get_vars() 18699 1726882346.96734: Calling all_inventory to load vars for managed_node1 18699 1726882346.96737: Calling groups_inventory to load vars for managed_node1 18699 1726882346.96739: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882346.96745: Calling all_plugins_play to load vars for managed_node1 18699 1726882346.96747: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882346.96750: Calling groups_plugins_play to load vars for managed_node1 18699 1726882346.98309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882347.00748: done with get_vars() 18699 1726882347.00777: done getting variables 18699 1726882347.00835: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:32:27 -0400 (0:00:00.256) 0:00:20.604 ****** 18699 1726882347.00861: entering _queue_task() for managed_node1/gather_facts 18699 1726882347.01343: worker is 1 (out of 1 available) 18699 1726882347.01383: exiting _queue_task() for managed_node1/gather_facts 18699 1726882347.01399: done queuing things up, now waiting for results queue to drain 18699 1726882347.01400: waiting for pending results... 
18699 1726882347.01725: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882347.01820: in run() - task 12673a56-9f93-1ce6-d207-00000000032b 18699 1726882347.01843: variable 'ansible_search_path' from source: unknown 18699 1726882347.01885: calling self._execute() 18699 1726882347.02038: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882347.02042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882347.02045: variable 'omit' from source: magic vars 18699 1726882347.02499: variable 'ansible_distribution_major_version' from source: facts 18699 1726882347.02518: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882347.02583: variable 'omit' from source: magic vars 18699 1726882347.02587: variable 'omit' from source: magic vars 18699 1726882347.02609: variable 'omit' from source: magic vars 18699 1726882347.02659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882347.02718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882347.02753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882347.02784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882347.02810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882347.02913: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882347.02917: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882347.02919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882347.02998: Set connection var ansible_connection to ssh 18699 1726882347.03021: Set 
connection var ansible_pipelining to False 18699 1726882347.03039: Set connection var ansible_shell_executable to /bin/sh 18699 1726882347.03051: Set connection var ansible_timeout to 10 18699 1726882347.03062: Set connection var ansible_shell_type to sh 18699 1726882347.03128: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882347.03131: variable 'ansible_shell_executable' from source: unknown 18699 1726882347.03133: variable 'ansible_connection' from source: unknown 18699 1726882347.03143: variable 'ansible_module_compression' from source: unknown 18699 1726882347.03151: variable 'ansible_shell_type' from source: unknown 18699 1726882347.03158: variable 'ansible_shell_executable' from source: unknown 18699 1726882347.03164: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882347.03171: variable 'ansible_pipelining' from source: unknown 18699 1726882347.03177: variable 'ansible_timeout' from source: unknown 18699 1726882347.03184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882347.03420: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882347.03437: variable 'omit' from source: magic vars 18699 1726882347.03455: starting attempt loop 18699 1726882347.03566: running the handler 18699 1726882347.03572: variable 'ansible_facts' from source: unknown 18699 1726882347.03576: _low_level_execute_command(): starting 18699 1726882347.03578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882347.04380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882347.04474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882347.04519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882347.04538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882347.04572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882347.04650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882347.06543: stdout chunk (state=3): >>>/root <<< 18699 1726882347.06636: stdout chunk (state=3): >>><<< 18699 1726882347.06640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882347.06644: stderr chunk (state=3): >>><<< 18699 1726882347.06703: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882347.06805: _low_level_execute_command(): starting 18699 1726882347.06809: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031 `" && echo ansible-tmp-1726882347.0668619-19710-49274016062031="` echo /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031 `" ) && sleep 0' 18699 1726882347.07428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882347.07450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882347.07466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882347.07585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882347.07611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882347.07807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882347.09651: stdout chunk (state=3): >>>ansible-tmp-1726882347.0668619-19710-49274016062031=/root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031 <<< 18699 1726882347.09810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882347.09822: stdout chunk (state=3): >>><<< 18699 1726882347.09851: stderr chunk (state=3): >>><<< 18699 1726882347.09881: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882347.0668619-19710-49274016062031=/root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882347.09932: variable 'ansible_module_compression' from source: unknown 18699 1726882347.10013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882347.10090: variable 'ansible_facts' from source: unknown 18699 1726882347.10401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py 18699 1726882347.10566: Sending initial data 18699 1726882347.10582: Sent initial data (153 bytes) 18699 1726882347.11411: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882347.11436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882347.11510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882347.13010: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882347.13087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882347.13150: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp_qwewf_k /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py <<< 18699 1726882347.13157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py" <<< 18699 1726882347.13252: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp_qwewf_k" to remote "/root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py" <<< 18699 1726882347.14892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882347.14898: stdout chunk (state=3): >>><<< 18699 1726882347.14900: stderr chunk (state=3): >>><<< 18699 1726882347.14921: done transferring module to remote 18699 1726882347.14937: _low_level_execute_command(): starting 18699 1726882347.15000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/ /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py && sleep 0' 18699 1726882347.15622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882347.15634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882347.15658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882347.15761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882347.15785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882347.15802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882347.15878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882347.17714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882347.17730: stderr chunk (state=3): >>><<< 18699 1726882347.17739: stdout chunk (state=3): >>><<< 18699 1726882347.17761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882347.17855: _low_level_execute_command(): starting 18699 1726882347.17859: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/AnsiballZ_setup.py && sleep 0' 18699 1726882347.18437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882347.18453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882347.18469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882347.18486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882347.18510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882347.18604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK 
<<< 18699 1726882347.18631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882347.18714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882347.84977: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.419921875, "5m": 0.31787109375, "15m": 0.15869140625}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "27", "epoch": "1726882347", "epoch_int": "1726882347", "date": "2024-09-20", "time": "21:32:27", "iso8601_micro": 
"2024-09-21T01:32:27.460703Z", "iso8601": "2024-09-21T01:32:27Z", "iso8601_basic": "20240920T213227460703", "iso8601_basic_short": "20240920T213227", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2965, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 566, "free": 2965}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 780, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794799616, "block_size": 4096, "block_total": 65519099, "block_available": 63914746, "block_used": 1604353, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["peerlsr27", "lsr27", "lo", "eth0"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on",<<< 18699 1726882347.85015: stdout chunk (state=3): >>> "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": 
"10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee", "fe80::c05a:92ff:fe97:6608"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882347.86958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882347.86986: stdout chunk (state=3): >>><<< 18699 1726882347.86990: stderr chunk (state=3): >>><<< 18699 1726882347.87200: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.419921875, "5m": 0.31787109375, "15m": 0.15869140625}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "27", "epoch": "1726882347", "epoch_int": "1726882347", "date": "2024-09-20", "time": "21:32:27", "iso8601_micro": "2024-09-21T01:32:27.460703Z", "iso8601": "2024-09-21T01:32:27Z", "iso8601_basic": 
"20240920T213227460703", "iso8601_basic_short": "20240920T213227", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2965, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 566, "free": 2965}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 780, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794799616, "block_size": 4096, "block_total": 65519099, "block_available": 63914746, "block_used": 1604353, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["peerlsr27", "lsr27", "lo", "eth0"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": 
"on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": 
false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", 
"alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee", "fe80::c05a:92ff:fe97:6608"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882347.87543: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882347.87569: _low_level_execute_command(): starting 18699 1726882347.87578: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882347.0668619-19710-49274016062031/ > /dev/null 2>&1 && sleep 0' 18699 1726882347.88192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882347.88216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882347.88232: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882347.88313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882347.88347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882347.88364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882347.88390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882347.88476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882347.90404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882347.90700: stdout chunk (state=3): >>><<< 18699 1726882347.90704: stderr chunk (state=3): >>><<< 18699 1726882347.90706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882347.90709: handler run complete 18699 1726882347.90878: variable 'ansible_facts' from source: unknown 18699 1726882347.91023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882347.91415: variable 'ansible_facts' from source: unknown 18699 1726882347.91545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882347.91707: attempt loop complete, returning result 18699 1726882347.91717: _execute() done 18699 1726882347.91723: dumping result to json 18699 1726882347.91770: done dumping result, returning 18699 1726882347.91783: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-00000000032b] 18699 1726882347.91791: sending task result for task 12673a56-9f93-1ce6-d207-00000000032b ok: [managed_node1] 18699 1726882347.93090: no more pending results, returning what we have 18699 1726882347.93096: results queue empty 18699 1726882347.93097: checking for any_errors_fatal 18699 1726882347.93098: done checking for any_errors_fatal 18699 1726882347.93099: checking for max_fail_percentage 18699 
1726882347.93101: done checking for max_fail_percentage 18699 1726882347.93102: checking to see if all hosts have failed and the running result is not ok 18699 1726882347.93103: done checking to see if all hosts have failed 18699 1726882347.93103: getting the remaining hosts for this loop 18699 1726882347.93104: done getting the remaining hosts for this loop 18699 1726882347.93108: getting the next task for host managed_node1 18699 1726882347.93113: done getting next task for host managed_node1 18699 1726882347.93115: ^ task is: TASK: meta (flush_handlers) 18699 1726882347.93116: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882347.93121: getting variables 18699 1726882347.93122: in VariableManager get_vars() 18699 1726882347.93151: Calling all_inventory to load vars for managed_node1 18699 1726882347.93154: Calling groups_inventory to load vars for managed_node1 18699 1726882347.93156: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882347.93172: done sending task result for task 12673a56-9f93-1ce6-d207-00000000032b 18699 1726882347.93175: WORKER PROCESS EXITING 18699 1726882347.93186: Calling all_plugins_play to load vars for managed_node1 18699 1726882347.93189: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882347.93194: Calling groups_plugins_play to load vars for managed_node1 18699 1726882347.94538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882347.96978: done with get_vars() 18699 1726882347.97017: done getting variables 18699 1726882347.97086: in VariableManager get_vars() 18699 1726882347.97105: Calling all_inventory to load vars for managed_node1 18699 1726882347.97107: 
Calling groups_inventory to load vars for managed_node1 18699 1726882347.97109: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882347.97119: Calling all_plugins_play to load vars for managed_node1 18699 1726882347.97122: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882347.97124: Calling groups_plugins_play to load vars for managed_node1 18699 1726882347.98562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.00956: done with get_vars() 18699 1726882348.00997: done queuing things up, now waiting for results queue to drain 18699 1726882348.01000: results queue empty 18699 1726882348.01001: checking for any_errors_fatal 18699 1726882348.01005: done checking for any_errors_fatal 18699 1726882348.01006: checking for max_fail_percentage 18699 1726882348.01007: done checking for max_fail_percentage 18699 1726882348.01013: checking to see if all hosts have failed and the running result is not ok 18699 1726882348.01014: done checking to see if all hosts have failed 18699 1726882348.01015: getting the remaining hosts for this loop 18699 1726882348.01016: done getting the remaining hosts for this loop 18699 1726882348.01019: getting the next task for host managed_node1 18699 1726882348.01114: done getting next task for host managed_node1 18699 1726882348.01119: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18699 1726882348.01121: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882348.01132: getting variables 18699 1726882348.01133: in VariableManager get_vars() 18699 1726882348.01189: Calling all_inventory to load vars for managed_node1 18699 1726882348.01192: Calling groups_inventory to load vars for managed_node1 18699 1726882348.01204: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882348.01210: Calling all_plugins_play to load vars for managed_node1 18699 1726882348.01213: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882348.01216: Calling groups_plugins_play to load vars for managed_node1 18699 1726882348.02928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.05531: done with get_vars() 18699 1726882348.05554: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:32:28 -0400 (0:00:01.048) 0:00:21.652 ****** 18699 1726882348.05673: entering _queue_task() for managed_node1/include_tasks 18699 1726882348.06686: worker is 1 (out of 1 available) 18699 1726882348.06848: exiting _queue_task() for managed_node1/include_tasks 18699 1726882348.06858: done queuing things up, now waiting for results queue to drain 18699 1726882348.06859: waiting for pending results... 
18699 1726882348.07031: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18699 1726882348.07199: in run() - task 12673a56-9f93-1ce6-d207-00000000003c 18699 1726882348.07316: variable 'ansible_search_path' from source: unknown 18699 1726882348.07320: variable 'ansible_search_path' from source: unknown 18699 1726882348.07409: calling self._execute() 18699 1726882348.07533: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.07536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882348.07546: variable 'omit' from source: magic vars 18699 1726882348.08075: variable 'ansible_distribution_major_version' from source: facts 18699 1726882348.08078: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882348.08081: _execute() done 18699 1726882348.08087: dumping result to json 18699 1726882348.08099: done dumping result, returning 18699 1726882348.08142: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-1ce6-d207-00000000003c] 18699 1726882348.08145: sending task result for task 12673a56-9f93-1ce6-d207-00000000003c 18699 1726882348.08300: done sending task result for task 12673a56-9f93-1ce6-d207-00000000003c 18699 1726882348.08340: no more pending results, returning what we have 18699 1726882348.08344: in VariableManager get_vars() 18699 1726882348.08385: Calling all_inventory to load vars for managed_node1 18699 1726882348.08388: Calling groups_inventory to load vars for managed_node1 18699 1726882348.08390: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882348.08406: Calling all_plugins_play to load vars for managed_node1 18699 1726882348.08409: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882348.08413: Calling groups_plugins_play to load vars for managed_node1 18699 
1726882348.09007: WORKER PROCESS EXITING 18699 1726882348.10028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.12026: done with get_vars() 18699 1726882348.12053: variable 'ansible_search_path' from source: unknown 18699 1726882348.12055: variable 'ansible_search_path' from source: unknown 18699 1726882348.12092: we have included files to process 18699 1726882348.12118: generating all_blocks data 18699 1726882348.12120: done generating all_blocks data 18699 1726882348.12121: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18699 1726882348.12122: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18699 1726882348.12125: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18699 1726882348.13013: done processing included file 18699 1726882348.13016: iterating over new_blocks loaded from include file 18699 1726882348.13017: in VariableManager get_vars() 18699 1726882348.13103: done with get_vars() 18699 1726882348.13105: filtering new block on tags 18699 1726882348.13195: done filtering new block on tags 18699 1726882348.13199: in VariableManager get_vars() 18699 1726882348.13218: done with get_vars() 18699 1726882348.13219: filtering new block on tags 18699 1726882348.13243: done filtering new block on tags 18699 1726882348.13249: in VariableManager get_vars() 18699 1726882348.13270: done with get_vars() 18699 1726882348.13272: filtering new block on tags 18699 1726882348.13288: done filtering new block on tags 18699 1726882348.13291: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18699 1726882348.13319: extending task lists for all hosts 
with included blocks 18699 1726882348.13887: done extending task lists 18699 1726882348.13888: done processing included files 18699 1726882348.13889: results queue empty 18699 1726882348.13890: checking for any_errors_fatal 18699 1726882348.13891: done checking for any_errors_fatal 18699 1726882348.13892: checking for max_fail_percentage 18699 1726882348.13925: done checking for max_fail_percentage 18699 1726882348.13926: checking to see if all hosts have failed and the running result is not ok 18699 1726882348.13927: done checking to see if all hosts have failed 18699 1726882348.13928: getting the remaining hosts for this loop 18699 1726882348.13929: done getting the remaining hosts for this loop 18699 1726882348.13932: getting the next task for host managed_node1 18699 1726882348.13936: done getting next task for host managed_node1 18699 1726882348.13939: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18699 1726882348.13941: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882348.13951: getting variables 18699 1726882348.13952: in VariableManager get_vars() 18699 1726882348.13966: Calling all_inventory to load vars for managed_node1 18699 1726882348.13969: Calling groups_inventory to load vars for managed_node1 18699 1726882348.13989: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882348.13997: Calling all_plugins_play to load vars for managed_node1 18699 1726882348.14000: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882348.14003: Calling groups_plugins_play to load vars for managed_node1 18699 1726882348.23003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.25035: done with get_vars() 18699 1726882348.25077: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:32:28 -0400 (0:00:00.195) 0:00:21.848 ****** 18699 1726882348.25218: entering _queue_task() for managed_node1/setup 18699 1726882348.25850: worker is 1 (out of 1 available) 18699 1726882348.25864: exiting _queue_task() for managed_node1/setup 18699 1726882348.25874: done queuing things up, now waiting for results queue to drain 18699 1726882348.25875: waiting for pending results... 
18699 1726882348.26191: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18699 1726882348.26576: in run() - task 12673a56-9f93-1ce6-d207-00000000036c 18699 1726882348.26649: variable 'ansible_search_path' from source: unknown 18699 1726882348.26653: variable 'ansible_search_path' from source: unknown 18699 1726882348.26719: calling self._execute() 18699 1726882348.26909: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.26913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882348.26916: variable 'omit' from source: magic vars 18699 1726882348.27388: variable 'ansible_distribution_major_version' from source: facts 18699 1726882348.27439: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882348.27907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882348.30602: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882348.30719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882348.30760: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882348.30836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882348.30894: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882348.30982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882348.31024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882348.31108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882348.31112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882348.31155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882348.31225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882348.31255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882348.31284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882348.31401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882348.31405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882348.31621: variable '__network_required_facts' from source: role 
'' defaults 18699 1726882348.31698: variable 'ansible_facts' from source: unknown 18699 1726882348.33230: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18699 1726882348.33272: when evaluation is False, skipping this task 18699 1726882348.33321: _execute() done 18699 1726882348.33325: dumping result to json 18699 1726882348.33327: done dumping result, returning 18699 1726882348.33371: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-1ce6-d207-00000000036c] 18699 1726882348.33375: sending task result for task 12673a56-9f93-1ce6-d207-00000000036c skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882348.33683: no more pending results, returning what we have 18699 1726882348.33689: results queue empty 18699 1726882348.33690: checking for any_errors_fatal 18699 1726882348.33692: done checking for any_errors_fatal 18699 1726882348.33695: checking for max_fail_percentage 18699 1726882348.33697: done checking for max_fail_percentage 18699 1726882348.33698: checking to see if all hosts have failed and the running result is not ok 18699 1726882348.33699: done checking to see if all hosts have failed 18699 1726882348.33699: getting the remaining hosts for this loop 18699 1726882348.33701: done getting the remaining hosts for this loop 18699 1726882348.33705: getting the next task for host managed_node1 18699 1726882348.33715: done getting next task for host managed_node1 18699 1726882348.33724: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18699 1726882348.33727: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882348.33741: getting variables 18699 1726882348.33743: in VariableManager get_vars() 18699 1726882348.33790: Calling all_inventory to load vars for managed_node1 18699 1726882348.33860: Calling groups_inventory to load vars for managed_node1 18699 1726882348.33863: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882348.33875: Calling all_plugins_play to load vars for managed_node1 18699 1726882348.33878: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882348.33881: Calling groups_plugins_play to load vars for managed_node1 18699 1726882348.34480: done sending task result for task 12673a56-9f93-1ce6-d207-00000000036c 18699 1726882348.34484: WORKER PROCESS EXITING 18699 1726882348.36138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.37722: done with get_vars() 18699 1726882348.37752: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:32:28 -0400 (0:00:00.126) 0:00:21.974 ****** 18699 1726882348.37856: entering _queue_task() for managed_node1/stat 18699 1726882348.38223: worker is 1 (out of 1 available) 18699 1726882348.38235: exiting _queue_task() for managed_node1/stat 18699 1726882348.38246: done queuing things up, now waiting for results queue to drain 18699 1726882348.38247: waiting for pending results... 
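The log above shows the role's guard task "Ensure ansible_facts used by role are present" being skipped because the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False. A minimal Python sketch of what that Jinja expression computes (the variable values here are illustrative only, since the actual result is censored by `no_log: true`):

```python
# Sketch of the conditional the log evaluates:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Jinja's `difference` filter yields the items of the first list that are
# absent from the second, so the task only runs when a required fact is missing.

def missing_required_facts(required_facts, gathered_facts):
    """Return the required fact names not present in the gathered facts dict."""
    return [name for name in required_facts if name not in gathered_facts]

# Illustrative values (not taken from the log, which hides the real output):
required = ["distribution", "distribution_major_version", "os_family"]
gathered = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# Empty difference -> `length > 0` is False -> the task is skipped, as logged.
missing = missing_required_facts(required, gathered)
```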
18699 1726882348.38548: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18699 1726882348.38681: in run() - task 12673a56-9f93-1ce6-d207-00000000036e 18699 1726882348.38706: variable 'ansible_search_path' from source: unknown 18699 1726882348.38720: variable 'ansible_search_path' from source: unknown 18699 1726882348.38761: calling self._execute() 18699 1726882348.39101: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.39209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882348.39214: variable 'omit' from source: magic vars 18699 1726882348.39770: variable 'ansible_distribution_major_version' from source: facts 18699 1726882348.39871: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882348.40107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882348.40427: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882348.40476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882348.40564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882348.40605: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882348.40707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882348.40742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882348.40773: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882348.40809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882348.40916: variable '__network_is_ostree' from source: set_fact 18699 1726882348.40929: Evaluated conditional (not __network_is_ostree is defined): False 18699 1726882348.40936: when evaluation is False, skipping this task 18699 1726882348.40947: _execute() done 18699 1726882348.40954: dumping result to json 18699 1726882348.40963: done dumping result, returning 18699 1726882348.40976: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-1ce6-d207-00000000036e] 18699 1726882348.40985: sending task result for task 12673a56-9f93-1ce6-d207-00000000036e 18699 1726882348.41201: done sending task result for task 12673a56-9f93-1ce6-d207-00000000036e 18699 1726882348.41204: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18699 1726882348.41258: no more pending results, returning what we have 18699 1726882348.41262: results queue empty 18699 1726882348.41263: checking for any_errors_fatal 18699 1726882348.41268: done checking for any_errors_fatal 18699 1726882348.41269: checking for max_fail_percentage 18699 1726882348.41271: done checking for max_fail_percentage 18699 1726882348.41272: checking to see if all hosts have failed and the running result is not ok 18699 1726882348.41273: done checking to see if all hosts have failed 18699 1726882348.41274: getting the remaining hosts for this loop 18699 1726882348.41276: done getting the remaining hosts for this loop 18699 
1726882348.41280: getting the next task for host managed_node1 18699 1726882348.41289: done getting next task for host managed_node1 18699 1726882348.41301: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18699 1726882348.41304: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882348.41318: getting variables 18699 1726882348.41320: in VariableManager get_vars() 18699 1726882348.41360: Calling all_inventory to load vars for managed_node1 18699 1726882348.41363: Calling groups_inventory to load vars for managed_node1 18699 1726882348.41365: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882348.41375: Calling all_plugins_play to load vars for managed_node1 18699 1726882348.41378: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882348.41380: Calling groups_plugins_play to load vars for managed_node1 18699 1726882348.43085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.44684: done with get_vars() 18699 1726882348.44724: done getting variables 18699 1726882348.44788: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:32:28 -0400 (0:00:00.069) 0:00:22.044 ****** 18699 1726882348.44832: entering _queue_task() for managed_node1/set_fact 18699 1726882348.45204: worker is 1 (out of 1 available) 18699 1726882348.45215: exiting _queue_task() for managed_node1/set_fact 18699 1726882348.45226: done queuing things up, now waiting for results queue to drain 18699 1726882348.45227: waiting for pending results... 18699 1726882348.45616: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18699 1726882348.45650: in run() - task 12673a56-9f93-1ce6-d207-00000000036f 18699 1726882348.45668: variable 'ansible_search_path' from source: unknown 18699 1726882348.45674: variable 'ansible_search_path' from source: unknown 18699 1726882348.45718: calling self._execute() 18699 1726882348.45817: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.45832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882348.45848: variable 'omit' from source: magic vars 18699 1726882348.46242: variable 'ansible_distribution_major_version' from source: facts 18699 1726882348.46265: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882348.46441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882348.46803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882348.46807: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882348.46811: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 
1726882348.46888: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882348.46980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882348.47014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882348.47042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882348.47066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882348.47166: variable '__network_is_ostree' from source: set_fact 18699 1726882348.47179: Evaluated conditional (not __network_is_ostree is defined): False 18699 1726882348.47187: when evaluation is False, skipping this task 18699 1726882348.47199: _execute() done 18699 1726882348.47208: dumping result to json 18699 1726882348.47215: done dumping result, returning 18699 1726882348.47226: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-1ce6-d207-00000000036f] 18699 1726882348.47237: sending task result for task 12673a56-9f93-1ce6-d207-00000000036f 18699 1726882348.47421: done sending task result for task 12673a56-9f93-1ce6-d207-00000000036f 18699 1726882348.47424: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18699 1726882348.47492: no more pending results, returning what we 
have 18699 1726882348.47500: results queue empty 18699 1726882348.47501: checking for any_errors_fatal 18699 1726882348.47508: done checking for any_errors_fatal 18699 1726882348.47509: checking for max_fail_percentage 18699 1726882348.47511: done checking for max_fail_percentage 18699 1726882348.47512: checking to see if all hosts have failed and the running result is not ok 18699 1726882348.47513: done checking to see if all hosts have failed 18699 1726882348.47514: getting the remaining hosts for this loop 18699 1726882348.47515: done getting the remaining hosts for this loop 18699 1726882348.47520: getting the next task for host managed_node1 18699 1726882348.47529: done getting next task for host managed_node1 18699 1726882348.47533: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18699 1726882348.47536: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882348.47550: getting variables 18699 1726882348.47552: in VariableManager get_vars() 18699 1726882348.47590: Calling all_inventory to load vars for managed_node1 18699 1726882348.47597: Calling groups_inventory to load vars for managed_node1 18699 1726882348.47600: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882348.47611: Calling all_plugins_play to load vars for managed_node1 18699 1726882348.47614: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882348.47617: Calling groups_plugins_play to load vars for managed_node1 18699 1726882348.49441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882348.51296: done with get_vars() 18699 1726882348.51328: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:32:28 -0400 (0:00:00.065) 0:00:22.110 ****** 18699 1726882348.51443: entering _queue_task() for managed_node1/service_facts 18699 1726882348.51856: worker is 1 (out of 1 available) 18699 1726882348.51869: exiting _queue_task() for managed_node1/service_facts 18699 1726882348.51881: done queuing things up, now waiting for results queue to drain 18699 1726882348.51882: waiting for pending results... 
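Both the "Check if system is ostree" stat task and the "Set flag to indicate system is ostree" set_fact task above are skipped with `false_condition: not __network_is_ostree is defined`, i.e. the flag was already set earlier in the run. A hedged sketch of the underlying pattern; the `/run/ostree-booted` marker path is an assumption based on the conventional ostree check, as the task internals are not shown in this log:

```python
import os

def detect_ostree(marker="/run/ostree-booted"):
    """Assumed detection: an rpm-ostree system exposes this marker file."""
    return os.path.exists(marker)

# The guard `when: not __network_is_ostree is defined` runs the check once,
# then every later evaluation is False and the tasks are skipped, as logged.
facts = {}
if "__network_is_ostree" not in facts:   # first run: condition True
    facts["__network_is_ostree"] = detect_ostree()
# subsequent runs: key exists, condition False, task skipped
```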
18699 1726882348.52215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18699 1726882348.52312: in run() - task 12673a56-9f93-1ce6-d207-000000000371 18699 1726882348.52336: variable 'ansible_search_path' from source: unknown 18699 1726882348.52343: variable 'ansible_search_path' from source: unknown 18699 1726882348.52386: calling self._execute() 18699 1726882348.52501: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.52504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882348.52507: variable 'omit' from source: magic vars 18699 1726882348.53507: variable 'ansible_distribution_major_version' from source: facts 18699 1726882348.53703: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882348.53707: variable 'omit' from source: magic vars 18699 1726882348.53710: variable 'omit' from source: magic vars 18699 1726882348.53713: variable 'omit' from source: magic vars 18699 1726882348.53716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882348.53928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882348.53954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882348.53977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882348.53992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882348.54035: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882348.54301: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.54306: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18699 1726882348.54323: Set connection var ansible_connection to ssh 18699 1726882348.54337: Set connection var ansible_pipelining to False 18699 1726882348.54348: Set connection var ansible_shell_executable to /bin/sh 18699 1726882348.54359: Set connection var ansible_timeout to 10 18699 1726882348.54367: Set connection var ansible_shell_type to sh 18699 1726882348.54377: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882348.54417: variable 'ansible_shell_executable' from source: unknown 18699 1726882348.54701: variable 'ansible_connection' from source: unknown 18699 1726882348.54706: variable 'ansible_module_compression' from source: unknown 18699 1726882348.54708: variable 'ansible_shell_type' from source: unknown 18699 1726882348.54711: variable 'ansible_shell_executable' from source: unknown 18699 1726882348.54713: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882348.54715: variable 'ansible_pipelining' from source: unknown 18699 1726882348.54717: variable 'ansible_timeout' from source: unknown 18699 1726882348.54719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882348.54856: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882348.55201: variable 'omit' from source: magic vars 18699 1726882348.55204: starting attempt loop 18699 1726882348.55207: running the handler 18699 1726882348.55210: _low_level_execute_command(): starting 18699 1726882348.55212: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882348.56360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882348.56381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 18699 1726882348.56402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882348.56421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882348.56438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882348.56450: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882348.56464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882348.56482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882348.56574: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882348.56606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882348.56627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882348.56924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882348.58613: stdout chunk (state=3): >>>/root <<< 18699 1726882348.58768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882348.58772: stdout chunk (state=3): >>><<< 18699 1726882348.58774: stderr chunk (state=3): >>><<< 18699 1726882348.58799: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882348.58823: _low_level_execute_command(): starting 18699 1726882348.58835: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581 `" && echo ansible-tmp-1726882348.58807-19771-266748164028581="` echo /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581 `" ) && sleep 0' 18699 1726882348.59869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882348.60018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882348.60036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882348.60064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 
1726882348.60174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882348.60187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882348.60212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882348.60286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882348.62203: stdout chunk (state=3): >>>ansible-tmp-1726882348.58807-19771-266748164028581=/root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581 <<< 18699 1726882348.62354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882348.62365: stdout chunk (state=3): >>><<< 18699 1726882348.62382: stderr chunk (state=3): >>><<< 18699 1726882348.62410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882348.58807-19771-266748164028581=/root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882348.62468: variable 'ansible_module_compression' from source: unknown 18699 1726882348.62535: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18699 1726882348.62581: variable 'ansible_facts' from source: unknown 18699 1726882348.62683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py 18699 1726882348.62871: Sending initial data 18699 1726882348.62907: Sent initial data (160 bytes) 18699 1726882348.63635: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882348.63709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882348.63753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882348.63769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882348.63783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882348.63857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882348.65403: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882348.65458: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882348.65508: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp3rl351i4 /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py <<< 18699 1726882348.65512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py" <<< 18699 1726882348.65556: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp3rl351i4" to remote "/root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py" <<< 18699 1726882348.67605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882348.68067: stderr chunk (state=3): >>><<< 18699 1726882348.68071: stdout chunk (state=3): >>><<< 18699 1726882348.68074: done transferring module to remote 18699 1726882348.68076: _low_level_execute_command(): starting 18699 1726882348.68078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/ /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py && sleep 0' 18699 1726882348.69161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882348.69170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882348.69180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882348.69196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882348.69212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 <<< 18699 1726882348.69490: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882348.69671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882348.69932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882348.71649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882348.71750: stderr chunk (state=3): >>><<< 18699 1726882348.72004: stdout chunk (state=3): >>><<< 18699 1726882348.72012: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882348.72014: _low_level_execute_command(): starting 18699 1726882348.72017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/AnsiballZ_service_facts.py && sleep 0' 18699 1726882348.73024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882348.73290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882348.73451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882348.73540: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882350.24244: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18699 1726882350.26204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882350.26208: stdout chunk (state=3): >>><<< 18699 1726882350.26210: stderr chunk (state=3): >>><<< 18699 1726882350.26215: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": 
"microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
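The service_facts payload assembled above is a flat mapping from unit name to a small record with "name", "state", "status", and "source" keys. A minimal sketch of consuming that mapping outside Ansible (plain Python; the three sample entries are copied from the dump above, the helper name is made up for illustration):

```python
# Hypothetical subset of the "ansible_facts.services" mapping returned by the
# service_facts module in the stdout above. Keys are systemd unit names; each
# value repeats the name and carries state/status/source.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "getty@.service": {"name": "getty@.service", "state": "unknown",
                       "status": "enabled", "source": "systemd"},
}


def running_services(facts):
    """Return the sorted unit names whose reported state is "running"."""
    return sorted(name for name, svc in facts.items()
                  if svc["state"] == "running")


print(running_services(services))  # ['sshd.service']
```

Note that template units (names containing "@.") report state "unknown", as seen throughout the dump, so filtering on "state" rather than "status" avoids counting them.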
18699 1726882350.28136: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882350.28153: _low_level_execute_command(): starting 18699 1726882350.28410: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882348.58807-19771-266748164028581/ > /dev/null 2>&1 && sleep 0' 18699 1726882350.29573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882350.29611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882350.29628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882350.29646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882350.29662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882350.29792: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882350.29941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882350.30015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882350.31953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882350.31965: stdout chunk (state=3): >>><<< 18699 1726882350.31978: stderr chunk (state=3): >>><<< 18699 1726882350.32006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 18699 1726882350.32301: handler run complete 18699 1726882350.32532: variable 'ansible_facts' from source: unknown 18699 1726882350.32890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882350.33811: variable 'ansible_facts' from source: unknown 18699 1726882350.34101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882350.34466: attempt loop complete, returning result 18699 1726882350.34597: _execute() done 18699 1726882350.34606: dumping result to json 18699 1726882350.34670: done dumping result, returning 18699 1726882350.34707: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-1ce6-d207-000000000371] 18699 1726882350.34717: sending task result for task 12673a56-9f93-1ce6-d207-000000000371 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882350.36508: no more pending results, returning what we have 18699 1726882350.36511: results queue empty 18699 1726882350.36512: checking for any_errors_fatal 18699 1726882350.36517: done checking for any_errors_fatal 18699 1726882350.36518: checking for max_fail_percentage 18699 1726882350.36519: done checking for max_fail_percentage 18699 1726882350.36520: checking to see if all hosts have failed and the running result is not ok 18699 1726882350.36521: done checking to see if all hosts have failed 18699 1726882350.36522: getting the remaining hosts for this loop 18699 1726882350.36523: done getting the remaining hosts for this loop 18699 1726882350.36527: getting the next task for host managed_node1 18699 1726882350.36534: done getting next task for host managed_node1 18699 1726882350.36538: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 18699 1726882350.36541: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882350.36551: getting variables 18699 1726882350.36554: in VariableManager get_vars() 18699 1726882350.36588: Calling all_inventory to load vars for managed_node1 18699 1726882350.36591: Calling groups_inventory to load vars for managed_node1 18699 1726882350.36801: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882350.36811: Calling all_plugins_play to load vars for managed_node1 18699 1726882350.36815: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882350.36817: Calling groups_plugins_play to load vars for managed_node1 18699 1726882350.37524: done sending task result for task 12673a56-9f93-1ce6-d207-000000000371 18699 1726882350.37527: WORKER PROCESS EXITING 18699 1726882350.39614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882350.43602: done with get_vars() 18699 1726882350.43628: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:32:30 -0400 (0:00:01.924) 0:00:24.035 ****** 18699 1726882350.43926: entering _queue_task() for managed_node1/package_facts 18699 1726882350.44481: worker is 1 (out of 1 available) 18699 1726882350.44896: 
exiting _queue_task() for managed_node1/package_facts 18699 1726882350.44906: done queuing things up, now waiting for results queue to drain 18699 1726882350.44907: waiting for pending results... 18699 1726882350.45097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18699 1726882350.45362: in run() - task 12673a56-9f93-1ce6-d207-000000000372 18699 1726882350.45570: variable 'ansible_search_path' from source: unknown 18699 1726882350.45574: variable 'ansible_search_path' from source: unknown 18699 1726882350.45576: calling self._execute() 18699 1726882350.45843: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882350.45952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882350.45956: variable 'omit' from source: magic vars 18699 1726882350.46472: variable 'ansible_distribution_major_version' from source: facts 18699 1726882350.46621: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882350.46633: variable 'omit' from source: magic vars 18699 1726882350.46697: variable 'omit' from source: magic vars 18699 1726882350.46748: variable 'omit' from source: magic vars 18699 1726882350.46846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882350.46967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882350.46995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882350.47061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882350.47078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882350.47116: variable 'inventory_hostname' from source: 
host vars for 'managed_node1' 18699 1726882350.47205: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882350.47214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882350.47435: Set connection var ansible_connection to ssh 18699 1726882350.47449: Set connection var ansible_pipelining to False 18699 1726882350.47461: Set connection var ansible_shell_executable to /bin/sh 18699 1726882350.47473: Set connection var ansible_timeout to 10 18699 1726882350.47484: Set connection var ansible_shell_type to sh 18699 1726882350.47496: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882350.47701: variable 'ansible_shell_executable' from source: unknown 18699 1726882350.47704: variable 'ansible_connection' from source: unknown 18699 1726882350.47707: variable 'ansible_module_compression' from source: unknown 18699 1726882350.47709: variable 'ansible_shell_type' from source: unknown 18699 1726882350.47712: variable 'ansible_shell_executable' from source: unknown 18699 1726882350.47714: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882350.47716: variable 'ansible_pipelining' from source: unknown 18699 1726882350.47718: variable 'ansible_timeout' from source: unknown 18699 1726882350.47720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882350.48137: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882350.48141: variable 'omit' from source: magic vars 18699 1726882350.48144: starting attempt loop 18699 1726882350.48146: running the handler 18699 1726882350.48149: _low_level_execute_command(): starting 18699 1726882350.48163: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 
18699 1726882350.49688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882350.49788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882350.49923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882350.49957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882350.50007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882350.51661: stdout chunk (state=3): >>>/root <<< 18699 1726882350.51799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882350.51812: stdout chunk (state=3): >>><<< 18699 1726882350.52028: stderr chunk (state=3): >>><<< 18699 1726882350.52031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882350.52037: _low_level_execute_command(): starting 18699 1726882350.52039: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233 `" && echo ansible-tmp-1726882350.5193431-19879-45806067121233="` echo /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233 `" ) && sleep 0' 18699 1726882350.53032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882350.53300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882350.53443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882350.53517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882350.55388: stdout chunk (state=3): >>>ansible-tmp-1726882350.5193431-19879-45806067121233=/root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233 <<< 18699 1726882350.55551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882350.55561: stdout chunk (state=3): >>><<< 18699 1726882350.55571: stderr chunk (state=3): >>><<< 18699 1726882350.55596: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882350.5193431-19879-45806067121233=/root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882350.55822: variable 'ansible_module_compression' from source: unknown 18699 1726882350.55825: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18699 1726882350.55949: variable 'ansible_facts' from source: unknown 18699 1726882350.56320: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py 18699 1726882350.56723: Sending initial data 18699 1726882350.56726: Sent initial data (161 bytes) 18699 1726882350.57911: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 18699 1726882350.58047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882350.58078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882350.58224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882350.59769: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882350.59808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882350.59905: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpxgap2gu8 /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py <<< 18699 1726882350.59913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py" <<< 18699 1726882350.59972: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpxgap2gu8" to remote "/root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py" <<< 18699 1726882350.62737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882350.62741: stderr chunk (state=3): >>><<< 18699 1726882350.62743: stdout chunk (state=3): >>><<< 18699 1726882350.62745: done transferring module to remote 18699 1726882350.62747: _low_level_execute_command(): starting 18699 1726882350.62750: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/ /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py && sleep 0' 18699 1726882350.64045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882350.64113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882350.64162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882350.64380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882350.64402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882350.64426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882350.64503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882350.66307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882350.66318: stdout chunk (state=3): >>><<< 18699 1726882350.66540: stderr chunk (state=3): >>><<< 18699 1726882350.66546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882350.66554: _low_level_execute_command(): starting 18699 1726882350.66557: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/AnsiballZ_package_facts.py && sleep 0' 18699 1726882350.68126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882350.68130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882350.68254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 
1726882350.68424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882351.11987: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 18699 1726882351.12123: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": 
"2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", 
"version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", 
"version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": 
"makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", 
"version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", 
"version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", 
"version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": 
[{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", 
"version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": 
"2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": 
[{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18699 1726882351.13912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882351.14200: stderr chunk (state=3): >>><<< 18699 1726882351.14205: stdout chunk (state=3): >>><<< 18699 1726882351.14407: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882351.18473: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882351.18489: _low_level_execute_command(): starting 18699 1726882351.18520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882350.5193431-19879-45806067121233/ > /dev/null 2>&1 && sleep 0' 18699 1726882351.19495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882351.19499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882351.19502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882351.19504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found <<< 18699 1726882351.19512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882351.19569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882351.19573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882351.19634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882351.21507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882351.21539: stdout chunk (state=3): >>><<< 18699 1726882351.21542: stderr chunk (state=3): >>><<< 18699 1726882351.21557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 18699 1726882351.21567: handler run complete 18699 1726882351.22695: variable 'ansible_facts' from source: unknown 18699 1726882351.23388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.25782: variable 'ansible_facts' from source: unknown 18699 1726882351.26323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.27551: attempt loop complete, returning result 18699 1726882351.27563: _execute() done 18699 1726882351.27566: dumping result to json 18699 1726882351.27900: done dumping result, returning 18699 1726882351.27903: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-1ce6-d207-000000000372] 18699 1726882351.27905: sending task result for task 12673a56-9f93-1ce6-d207-000000000372 18699 1726882351.30816: done sending task result for task 12673a56-9f93-1ce6-d207-000000000372 18699 1726882351.30819: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882351.30972: no more pending results, returning what we have 18699 1726882351.30974: results queue empty 18699 1726882351.30975: checking for any_errors_fatal 18699 1726882351.30981: done checking for any_errors_fatal 18699 1726882351.30981: checking for max_fail_percentage 18699 1726882351.30983: done checking for max_fail_percentage 18699 1726882351.30984: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.30984: done checking to see if all hosts have failed 18699 1726882351.30985: getting the remaining hosts for this loop 18699 1726882351.30986: done getting the remaining hosts for this loop 18699 1726882351.30989: getting the next task for host managed_node1 18699 1726882351.31000: done 
getting next task for host managed_node1 18699 1726882351.31004: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18699 1726882351.31006: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882351.31018: getting variables 18699 1726882351.31019: in VariableManager get_vars() 18699 1726882351.31049: Calling all_inventory to load vars for managed_node1 18699 1726882351.31052: Calling groups_inventory to load vars for managed_node1 18699 1726882351.31057: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.31065: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.31068: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.31071: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.32312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.34439: done with get_vars() 18699 1726882351.34472: done getting variables 18699 1726882351.34540: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:32:31 -0400 (0:00:00.906) 0:00:24.941 ****** 18699 1726882351.34574: entering _queue_task() for managed_node1/debug 18699 1726882351.35548: worker is 1 (out of 1 available) 18699 
1726882351.35563: exiting _queue_task() for managed_node1/debug 18699 1726882351.35575: done queuing things up, now waiting for results queue to drain 18699 1726882351.35576: waiting for pending results... 18699 1726882351.36312: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18699 1726882351.36378: in run() - task 12673a56-9f93-1ce6-d207-00000000003d 18699 1726882351.36560: variable 'ansible_search_path' from source: unknown 18699 1726882351.36564: variable 'ansible_search_path' from source: unknown 18699 1726882351.36567: calling self._execute() 18699 1726882351.36896: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.36900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.36902: variable 'omit' from source: magic vars 18699 1726882351.37773: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.37813: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882351.37824: variable 'omit' from source: magic vars 18699 1726882351.37896: variable 'omit' from source: magic vars 18699 1726882351.38197: variable 'network_provider' from source: set_fact 18699 1726882351.38222: variable 'omit' from source: magic vars 18699 1726882351.38265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882351.38411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882351.38442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882351.38535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882351.38581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 18699 1726882351.38651: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882351.38683: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.38705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.39048: Set connection var ansible_connection to ssh 18699 1726882351.39052: Set connection var ansible_pipelining to False 18699 1726882351.39054: Set connection var ansible_shell_executable to /bin/sh 18699 1726882351.39056: Set connection var ansible_timeout to 10 18699 1726882351.39058: Set connection var ansible_shell_type to sh 18699 1726882351.39060: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882351.39362: variable 'ansible_shell_executable' from source: unknown 18699 1726882351.39367: variable 'ansible_connection' from source: unknown 18699 1726882351.39370: variable 'ansible_module_compression' from source: unknown 18699 1726882351.39372: variable 'ansible_shell_type' from source: unknown 18699 1726882351.39374: variable 'ansible_shell_executable' from source: unknown 18699 1726882351.39376: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.39378: variable 'ansible_pipelining' from source: unknown 18699 1726882351.39379: variable 'ansible_timeout' from source: unknown 18699 1726882351.39382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.40171: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882351.40175: variable 'omit' from source: magic vars 18699 1726882351.40177: starting attempt loop 18699 1726882351.40179: running the handler 18699 1726882351.40181: handler run 
complete 18699 1726882351.40220: attempt loop complete, returning result 18699 1726882351.40224: _execute() done 18699 1726882351.40229: dumping result to json 18699 1726882351.40237: done dumping result, returning 18699 1726882351.40251: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-1ce6-d207-00000000003d] 18699 1726882351.40319: sending task result for task 12673a56-9f93-1ce6-d207-00000000003d ok: [managed_node1] => {} MSG: Using network provider: nm 18699 1726882351.40798: no more pending results, returning what we have 18699 1726882351.40802: results queue empty 18699 1726882351.40803: checking for any_errors_fatal 18699 1726882351.40814: done checking for any_errors_fatal 18699 1726882351.40815: checking for max_fail_percentage 18699 1726882351.40817: done checking for max_fail_percentage 18699 1726882351.40818: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.40819: done checking to see if all hosts have failed 18699 1726882351.40820: getting the remaining hosts for this loop 18699 1726882351.40822: done getting the remaining hosts for this loop 18699 1726882351.40826: getting the next task for host managed_node1 18699 1726882351.40833: done getting next task for host managed_node1 18699 1726882351.40838: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18699 1726882351.40841: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882351.40851: getting variables 18699 1726882351.40853: in VariableManager get_vars() 18699 1726882351.40890: Calling all_inventory to load vars for managed_node1 18699 1726882351.40963: Calling groups_inventory to load vars for managed_node1 18699 1726882351.40968: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.40979: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.40982: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.40985: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.41520: done sending task result for task 12673a56-9f93-1ce6-d207-00000000003d 18699 1726882351.41524: WORKER PROCESS EXITING 18699 1726882351.42840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.47300: done with get_vars() 18699 1726882351.47335: done getting variables 18699 1726882351.47638: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:32:31 -0400 (0:00:00.131) 0:00:25.072 ****** 18699 1726882351.47680: entering _queue_task() for managed_node1/fail 18699 1726882351.48246: worker is 1 (out of 1 available) 18699 1726882351.48258: exiting _queue_task() for managed_node1/fail 18699 1726882351.48269: done queuing things up, now waiting for results queue to drain 18699 1726882351.48270: waiting for pending results... 
18699 1726882351.48512: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18699 1726882351.48682: in run() - task 12673a56-9f93-1ce6-d207-00000000003e 18699 1726882351.48687: variable 'ansible_search_path' from source: unknown 18699 1726882351.48690: variable 'ansible_search_path' from source: unknown 18699 1726882351.48772: calling self._execute() 18699 1726882351.48858: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.48869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.48889: variable 'omit' from source: magic vars 18699 1726882351.49307: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.49368: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882351.49484: variable 'network_state' from source: role '' defaults 18699 1726882351.49506: Evaluated conditional (network_state != {}): False 18699 1726882351.49514: when evaluation is False, skipping this task 18699 1726882351.49521: _execute() done 18699 1726882351.49540: dumping result to json 18699 1726882351.49582: done dumping result, returning 18699 1726882351.49591: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-1ce6-d207-00000000003e] 18699 1726882351.49596: sending task result for task 12673a56-9f93-1ce6-d207-00000000003e 18699 1726882351.49800: done sending task result for task 12673a56-9f93-1ce6-d207-00000000003e 18699 1726882351.49803: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882351.49862: no more pending results, 
returning what we have 18699 1726882351.49866: results queue empty 18699 1726882351.49867: checking for any_errors_fatal 18699 1726882351.49875: done checking for any_errors_fatal 18699 1726882351.49876: checking for max_fail_percentage 18699 1726882351.49877: done checking for max_fail_percentage 18699 1726882351.49878: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.49879: done checking to see if all hosts have failed 18699 1726882351.49880: getting the remaining hosts for this loop 18699 1726882351.49881: done getting the remaining hosts for this loop 18699 1726882351.49886: getting the next task for host managed_node1 18699 1726882351.49892: done getting next task for host managed_node1 18699 1726882351.49899: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18699 1726882351.49902: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882351.49919: getting variables 18699 1726882351.49920: in VariableManager get_vars() 18699 1726882351.49957: Calling all_inventory to load vars for managed_node1 18699 1726882351.50085: Calling groups_inventory to load vars for managed_node1 18699 1726882351.50089: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.50102: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.50105: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.50108: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.52311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.55857: done with get_vars() 18699 1726882351.55910: done getting variables 18699 1726882351.56002: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:32:31 -0400 (0:00:00.084) 0:00:25.157 ****** 18699 1726882351.56123: entering _queue_task() for managed_node1/fail 18699 1726882351.57024: worker is 1 (out of 1 available) 18699 1726882351.57037: exiting _queue_task() for managed_node1/fail 18699 1726882351.57049: done queuing things up, now waiting for results queue to drain 18699 1726882351.57050: waiting for pending results... 
18699 1726882351.57645: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18699 1726882351.57846: in run() - task 12673a56-9f93-1ce6-d207-00000000003f 18699 1726882351.57902: variable 'ansible_search_path' from source: unknown 18699 1726882351.57906: variable 'ansible_search_path' from source: unknown 18699 1726882351.57947: calling self._execute() 18699 1726882351.58041: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.58055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.58072: variable 'omit' from source: magic vars 18699 1726882351.58603: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.58606: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882351.58684: variable 'network_state' from source: role '' defaults 18699 1726882351.58710: Evaluated conditional (network_state != {}): False 18699 1726882351.58717: when evaluation is False, skipping this task 18699 1726882351.58720: _execute() done 18699 1726882351.58723: dumping result to json 18699 1726882351.58725: done dumping result, returning 18699 1726882351.58733: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-1ce6-d207-00000000003f] 18699 1726882351.58751: sending task result for task 12673a56-9f93-1ce6-d207-00000000003f 18699 1726882351.59043: done sending task result for task 12673a56-9f93-1ce6-d207-00000000003f 18699 1726882351.59047: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882351.59099: no more pending results, returning what we have 18699 
1726882351.59102: results queue empty 18699 1726882351.59103: checking for any_errors_fatal 18699 1726882351.59111: done checking for any_errors_fatal 18699 1726882351.59111: checking for max_fail_percentage 18699 1726882351.59113: done checking for max_fail_percentage 18699 1726882351.59114: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.59115: done checking to see if all hosts have failed 18699 1726882351.59115: getting the remaining hosts for this loop 18699 1726882351.59117: done getting the remaining hosts for this loop 18699 1726882351.59120: getting the next task for host managed_node1 18699 1726882351.59125: done getting next task for host managed_node1 18699 1726882351.59133: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18699 1726882351.59139: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882351.59156: getting variables 18699 1726882351.59157: in VariableManager get_vars() 18699 1726882351.59188: Calling all_inventory to load vars for managed_node1 18699 1726882351.59190: Calling groups_inventory to load vars for managed_node1 18699 1726882351.59194: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.59203: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.59206: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.59208: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.61026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.64383: done with get_vars() 18699 1726882351.64505: done getting variables 18699 1726882351.64681: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:32:31 -0400 (0:00:00.086) 0:00:25.243 ****** 18699 1726882351.64726: entering _queue_task() for managed_node1/fail 18699 1726882351.65526: worker is 1 (out of 1 available) 18699 1726882351.65545: exiting _queue_task() for managed_node1/fail 18699 1726882351.65557: done queuing things up, now waiting for results queue to drain 18699 1726882351.65558: waiting for pending results... 
18699 1726882351.66149: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18699 1726882351.66668: in run() - task 12673a56-9f93-1ce6-d207-000000000040 18699 1726882351.66673: variable 'ansible_search_path' from source: unknown 18699 1726882351.66676: variable 'ansible_search_path' from source: unknown 18699 1726882351.66678: calling self._execute() 18699 1726882351.67119: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.67123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.67125: variable 'omit' from source: magic vars 18699 1726882351.68039: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.68042: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882351.68445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882351.73156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882351.73237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882351.73301: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882351.73352: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882351.73384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882351.73552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.74033: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.74080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.74150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.74178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.74344: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.74364: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18699 1726882351.74507: variable 'ansible_distribution' from source: facts 18699 1726882351.74517: variable '__network_rh_distros' from source: role '' defaults 18699 1726882351.74532: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18699 1726882351.74936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.75001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.75034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 
1726882351.75109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.75120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.75199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.75237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.75336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.75340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.75357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.75402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.75436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18699 1726882351.75463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.75504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.75556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.76209: variable 'network_connections' from source: play vars 18699 1726882351.76212: variable 'profile' from source: play vars 18699 1726882351.76319: variable 'profile' from source: play vars 18699 1726882351.76322: variable 'interface' from source: set_fact 18699 1726882351.76428: variable 'interface' from source: set_fact 18699 1726882351.76438: variable 'network_state' from source: role '' defaults 18699 1726882351.76571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882351.76703: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882351.76804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882351.76807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882351.76810: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882351.76875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882351.76898: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882351.76905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.76938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882351.77113: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18699 1726882351.77178: when evaluation is False, skipping this task 18699 1726882351.77186: _execute() done 18699 1726882351.77189: dumping result to json 18699 1726882351.77199: done dumping result, returning 18699 1726882351.77205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-1ce6-d207-000000000040] 18699 1726882351.77210: sending task result for task 12673a56-9f93-1ce6-d207-000000000040 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18699 1726882351.77448: no more pending results, returning what we have 18699 1726882351.77451: results queue empty 18699 1726882351.77452: checking for 
any_errors_fatal 18699 1726882351.77459: done checking for any_errors_fatal 18699 1726882351.77459: checking for max_fail_percentage 18699 1726882351.77461: done checking for max_fail_percentage 18699 1726882351.77462: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.77463: done checking to see if all hosts have failed 18699 1726882351.77463: getting the remaining hosts for this loop 18699 1726882351.77465: done getting the remaining hosts for this loop 18699 1726882351.77469: getting the next task for host managed_node1 18699 1726882351.77475: done getting next task for host managed_node1 18699 1726882351.77479: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18699 1726882351.77481: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882351.77500: getting variables 18699 1726882351.77502: in VariableManager get_vars() 18699 1726882351.77541: Calling all_inventory to load vars for managed_node1 18699 1726882351.77544: Calling groups_inventory to load vars for managed_node1 18699 1726882351.77547: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.77558: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.77561: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.77564: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.78109: done sending task result for task 12673a56-9f93-1ce6-d207-000000000040 18699 1726882351.78113: WORKER PROCESS EXITING 18699 1726882351.79633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.81414: done with get_vars() 18699 1726882351.81437: done getting variables 18699 1726882351.81501: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:32:31 -0400 (0:00:00.168) 0:00:25.411 ****** 18699 1726882351.81530: entering _queue_task() for managed_node1/dnf 18699 1726882351.81866: worker is 1 (out of 1 available) 18699 1726882351.81878: exiting _queue_task() for managed_node1/dnf 18699 1726882351.81889: done queuing things up, now waiting for results queue to drain 18699 1726882351.81890: waiting for pending results... 
18699 1726882351.82176: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18699 1726882351.82283: in run() - task 12673a56-9f93-1ce6-d207-000000000041 18699 1726882351.82308: variable 'ansible_search_path' from source: unknown 18699 1726882351.82319: variable 'ansible_search_path' from source: unknown 18699 1726882351.82372: calling self._execute() 18699 1726882351.82512: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.82524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.82539: variable 'omit' from source: magic vars 18699 1726882351.82937: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.82953: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882351.83163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882351.85742: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882351.85801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882351.85999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882351.86003: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882351.86005: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882351.86013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.86050: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.86083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.86136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.86154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.86287: variable 'ansible_distribution' from source: facts 18699 1726882351.86300: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.86321: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18699 1726882351.86458: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882351.86600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.86629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.86656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.86708: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.86775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.86778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.86803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.86833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.86874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.86904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.86946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.86972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 
1726882351.87009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.87108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.87112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.87245: variable 'network_connections' from source: play vars 18699 1726882351.87263: variable 'profile' from source: play vars 18699 1726882351.87344: variable 'profile' from source: play vars 18699 1726882351.87353: variable 'interface' from source: set_fact 18699 1726882351.87416: variable 'interface' from source: set_fact 18699 1726882351.87497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882351.87676: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882351.87760: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882351.87763: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882351.87791: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882351.87838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882351.87871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882351.87913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.87978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882351.88004: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882351.88237: variable 'network_connections' from source: play vars 18699 1726882351.88247: variable 'profile' from source: play vars 18699 1726882351.88309: variable 'profile' from source: play vars 18699 1726882351.88317: variable 'interface' from source: set_fact 18699 1726882351.88412: variable 'interface' from source: set_fact 18699 1726882351.88415: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882351.88425: when evaluation is False, skipping this task 18699 1726882351.88433: _execute() done 18699 1726882351.88440: dumping result to json 18699 1726882351.88447: done dumping result, returning 18699 1726882351.88458: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000041] 18699 1726882351.88521: sending task result for task 12673a56-9f93-1ce6-d207-000000000041 18699 1726882351.88656: done sending task result for task 12673a56-9f93-1ce6-d207-000000000041 18699 1726882351.88660: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 18699 1726882351.88717: no more pending results, returning what we have 18699 1726882351.88721: results queue empty 18699 1726882351.88722: checking for any_errors_fatal 18699 1726882351.88897: done checking for any_errors_fatal 18699 1726882351.88899: checking for max_fail_percentage 18699 1726882351.88901: done checking for max_fail_percentage 18699 1726882351.88902: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.88903: done checking to see if all hosts have failed 18699 1726882351.88904: getting the remaining hosts for this loop 18699 1726882351.88906: done getting the remaining hosts for this loop 18699 1726882351.88910: getting the next task for host managed_node1 18699 1726882351.88916: done getting next task for host managed_node1 18699 1726882351.88920: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18699 1726882351.88922: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882351.88937: getting variables 18699 1726882351.88939: in VariableManager get_vars() 18699 1726882351.88978: Calling all_inventory to load vars for managed_node1 18699 1726882351.88981: Calling groups_inventory to load vars for managed_node1 18699 1726882351.88984: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.89046: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.89051: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.89056: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.90715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882351.92359: done with get_vars() 18699 1726882351.92391: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18699 1726882351.92467: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:32:31 -0400 (0:00:00.109) 0:00:25.521 ****** 18699 1726882351.92505: entering _queue_task() for managed_node1/yum 18699 1726882351.92937: worker is 1 (out of 1 available) 18699 1726882351.92949: exiting _queue_task() for managed_node1/yum 18699 1726882351.92959: done queuing things up, now waiting for results queue to drain 18699 1726882351.92959: waiting for pending results... 
18699 1726882351.93192: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18699 1726882351.93365: in run() - task 12673a56-9f93-1ce6-d207-000000000042 18699 1726882351.93369: variable 'ansible_search_path' from source: unknown 18699 1726882351.93371: variable 'ansible_search_path' from source: unknown 18699 1726882351.93385: calling self._execute() 18699 1726882351.93484: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882351.93499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882351.93515: variable 'omit' from source: magic vars 18699 1726882351.93910: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.93915: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882351.94128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882351.96632: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882351.96709: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882351.96759: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882351.96799: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882351.96833: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882351.96957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882351.96992: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882351.97029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882351.97101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882351.97136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882351.97284: variable 'ansible_distribution_major_version' from source: facts 18699 1726882351.97288: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18699 1726882351.97292: when evaluation is False, skipping this task 18699 1726882351.97302: _execute() done 18699 1726882351.97310: dumping result to json 18699 1726882351.97336: done dumping result, returning 18699 1726882351.97339: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000042] 18699 1726882351.97342: sending task result for task 12673a56-9f93-1ce6-d207-000000000042 18699 1726882351.97706: done sending task result for task 12673a56-9f93-1ce6-d207-000000000042 18699 1726882351.97710: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18699 1726882351.97764: no more pending results, returning 
what we have 18699 1726882351.97767: results queue empty 18699 1726882351.97768: checking for any_errors_fatal 18699 1726882351.97774: done checking for any_errors_fatal 18699 1726882351.97774: checking for max_fail_percentage 18699 1726882351.97776: done checking for max_fail_percentage 18699 1726882351.97777: checking to see if all hosts have failed and the running result is not ok 18699 1726882351.97778: done checking to see if all hosts have failed 18699 1726882351.97779: getting the remaining hosts for this loop 18699 1726882351.97780: done getting the remaining hosts for this loop 18699 1726882351.97784: getting the next task for host managed_node1 18699 1726882351.97789: done getting next task for host managed_node1 18699 1726882351.97795: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18699 1726882351.97798: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882351.97811: getting variables 18699 1726882351.97813: in VariableManager get_vars() 18699 1726882351.97851: Calling all_inventory to load vars for managed_node1 18699 1726882351.97855: Calling groups_inventory to load vars for managed_node1 18699 1726882351.97857: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882351.97867: Calling all_plugins_play to load vars for managed_node1 18699 1726882351.97870: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882351.97873: Calling groups_plugins_play to load vars for managed_node1 18699 1726882351.99519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882352.01126: done with get_vars() 18699 1726882352.01148: done getting variables 18699 1726882352.01211: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:32:32 -0400 (0:00:00.087) 0:00:25.608 ****** 18699 1726882352.01245: entering _queue_task() for managed_node1/fail 18699 1726882352.01600: worker is 1 (out of 1 available) 18699 1726882352.01612: exiting _queue_task() for managed_node1/fail 18699 1726882352.01627: done queuing things up, now waiting for results queue to drain 18699 1726882352.01628: waiting for pending results... 
18699 1726882352.01982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18699 1726882352.02096: in run() - task 12673a56-9f93-1ce6-d207-000000000043 18699 1726882352.02117: variable 'ansible_search_path' from source: unknown 18699 1726882352.02130: variable 'ansible_search_path' from source: unknown 18699 1726882352.02170: calling self._execute() 18699 1726882352.02305: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.02316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.02331: variable 'omit' from source: magic vars 18699 1726882352.02746: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.02765: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882352.02900: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882352.03204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882352.05471: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882352.05551: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882352.05591: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882352.05636: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882352.05674: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882352.05760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18699 1726882352.05817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.05848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.05903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.05923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.05972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.06099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.06102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.06105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.06107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.06137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.06185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.06240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.06313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.06345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.06622: variable 'network_connections' from source: play vars 18699 1726882352.06649: variable 'profile' from source: play vars 18699 1726882352.06738: variable 'profile' from source: play vars 18699 1726882352.06779: variable 'interface' from source: set_fact 18699 1726882352.06847: variable 'interface' from source: set_fact 18699 1726882352.07102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882352.07347: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882352.07369: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882352.07411: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882352.07457: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882352.07510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882352.07545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882352.07582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.07641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882352.07684: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882352.07950: variable 'network_connections' from source: play vars 18699 1726882352.07967: variable 'profile' from source: play vars 18699 1726882352.08077: variable 'profile' from source: play vars 18699 1726882352.08081: variable 'interface' from source: set_fact 18699 1726882352.08125: variable 'interface' from source: set_fact 18699 1726882352.08154: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882352.08161: when evaluation is False, skipping this task 18699 1726882352.08168: _execute() done 18699 1726882352.08174: dumping result to json 18699 1726882352.08186: done dumping result, returning 18699 1726882352.08204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000043] 18699 1726882352.08301: sending task result for task 12673a56-9f93-1ce6-d207-000000000043 18699 1726882352.08552: done sending task result for task 12673a56-9f93-1ce6-d207-000000000043 18699 1726882352.08556: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18699 1726882352.08626: no more pending results, returning what we have 18699 1726882352.08629: results queue empty 18699 1726882352.08630: checking for any_errors_fatal 18699 1726882352.08636: done checking for any_errors_fatal 18699 1726882352.08637: checking for max_fail_percentage 18699 1726882352.08639: done checking for max_fail_percentage 18699 1726882352.08640: checking to see if all hosts have failed and the running result is not ok 18699 1726882352.08641: done checking to see if all hosts have failed 18699 1726882352.08642: getting the remaining hosts for this loop 18699 1726882352.08643: done getting the remaining hosts for this loop 18699 1726882352.08646: getting the next task for host managed_node1 18699 1726882352.08652: done getting next task for host managed_node1 18699 1726882352.08656: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18699 1726882352.08658: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882352.08684: getting variables 18699 1726882352.08686: in VariableManager get_vars() 18699 1726882352.08731: Calling all_inventory to load vars for managed_node1 18699 1726882352.08734: Calling groups_inventory to load vars for managed_node1 18699 1726882352.08740: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882352.08750: Calling all_plugins_play to load vars for managed_node1 18699 1726882352.08753: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882352.08756: Calling groups_plugins_play to load vars for managed_node1 18699 1726882352.10408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882352.12377: done with get_vars() 18699 1726882352.12403: done getting variables 18699 1726882352.12466: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:32:32 -0400 (0:00:00.112) 0:00:25.721 ****** 18699 1726882352.12499: entering _queue_task() for managed_node1/package 18699 1726882352.12976: worker is 1 (out of 1 available) 18699 1726882352.12987: exiting _queue_task() for managed_node1/package 18699 1726882352.13002: done queuing things up, now waiting for results queue to drain 18699 1726882352.13003: waiting for pending results... 
18699 1726882352.13356: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 18699 1726882352.13389: in run() - task 12673a56-9f93-1ce6-d207-000000000044 18699 1726882352.13411: variable 'ansible_search_path' from source: unknown 18699 1726882352.13451: variable 'ansible_search_path' from source: unknown 18699 1726882352.13464: calling self._execute() 18699 1726882352.13564: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.13575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.13589: variable 'omit' from source: magic vars 18699 1726882352.14120: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.14124: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882352.14317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882352.14612: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882352.14666: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882352.14708: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882352.14792: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882352.14911: variable 'network_packages' from source: role '' defaults 18699 1726882352.15030: variable '__network_provider_setup' from source: role '' defaults 18699 1726882352.15047: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882352.15125: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882352.15140: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882352.15209: variable 
'__network_packages_default_nm' from source: role '' defaults 18699 1726882352.15396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882352.17644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882352.17717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882352.17815: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882352.17820: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882352.17830: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882352.17909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.17999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.18003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.18040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.18061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 
1726882352.18112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.18149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.18180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.18226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.18298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.18546: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18699 1726882352.18677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.18800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.18803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.18805: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.18813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.18906: variable 'ansible_python' from source: facts 18699 1726882352.18935: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18699 1726882352.19025: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882352.19109: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882352.19253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.19282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.19313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.19355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.19373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.19428: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.19491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.19500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.19543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.19602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.19720: variable 'network_connections' from source: play vars 18699 1726882352.19730: variable 'profile' from source: play vars 18699 1726882352.19840: variable 'profile' from source: play vars 18699 1726882352.19852: variable 'interface' from source: set_fact 18699 1726882352.19934: variable 'interface' from source: set_fact 18699 1726882352.20062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882352.20080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882352.20148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.20232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882352.20288: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882352.21031: variable 'network_connections' from source: play vars 18699 1726882352.21034: variable 'profile' from source: play vars 18699 1726882352.21111: variable 'profile' from source: play vars 18699 1726882352.21206: variable 'interface' from source: set_fact 18699 1726882352.21379: variable 'interface' from source: set_fact 18699 1726882352.21466: variable '__network_packages_default_wireless' from source: role '' defaults 18699 1726882352.21519: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882352.21932: variable 'network_connections' from source: play vars 18699 1726882352.21942: variable 'profile' from source: play vars 18699 1726882352.22202: variable 'profile' from source: play vars 18699 1726882352.22206: variable 'interface' from source: set_fact 18699 1726882352.22322: variable 'interface' from source: set_fact 18699 1726882352.22357: variable '__network_packages_default_team' from source: role '' defaults 18699 1726882352.22470: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882352.23227: variable 'network_connections' from source: play vars 18699 1726882352.23301: variable 'profile' from source: play vars 18699 1726882352.23371: variable 'profile' from source: play vars 18699 1726882352.23437: variable 'interface' from source: set_fact 18699 1726882352.23600: variable 'interface' from source: set_fact 18699 1726882352.23680: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882352.23753: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882352.23765: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882352.23830: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882352.24066: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18699 1726882352.24580: variable 'network_connections' from source: play vars 18699 1726882352.24591: variable 'profile' from source: play vars 18699 1726882352.24668: variable 'profile' from source: play vars 18699 1726882352.24678: variable 'interface' from source: set_fact 18699 1726882352.24800: variable 'interface' from source: set_fact 18699 1726882352.24803: variable 'ansible_distribution' from source: facts 18699 1726882352.24806: variable '__network_rh_distros' from source: role '' defaults 18699 1726882352.24808: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.24811: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18699 1726882352.24990: variable 'ansible_distribution' from source: facts 18699 1726882352.25008: variable '__network_rh_distros' from source: role '' defaults 18699 1726882352.25019: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.25043: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18699 1726882352.25229: variable 'ansible_distribution' from source: facts 18699 1726882352.25423: variable '__network_rh_distros' from source: role '' defaults 18699 1726882352.25426: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.25428: variable 'network_provider' from source: set_fact 18699 1726882352.25430: variable 'ansible_facts' from source: unknown 18699 1726882352.26933: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18699 
1726882352.26937: when evaluation is False, skipping this task 18699 1726882352.26944: _execute() done 18699 1726882352.26947: dumping result to json 18699 1726882352.26949: done dumping result, returning 18699 1726882352.26952: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-1ce6-d207-000000000044] 18699 1726882352.26956: sending task result for task 12673a56-9f93-1ce6-d207-000000000044 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18699 1726882352.27173: no more pending results, returning what we have 18699 1726882352.27177: results queue empty 18699 1726882352.27179: checking for any_errors_fatal 18699 1726882352.27186: done checking for any_errors_fatal 18699 1726882352.27187: checking for max_fail_percentage 18699 1726882352.27189: done checking for max_fail_percentage 18699 1726882352.27190: checking to see if all hosts have failed and the running result is not ok 18699 1726882352.27190: done checking to see if all hosts have failed 18699 1726882352.27191: getting the remaining hosts for this loop 18699 1726882352.27198: done getting the remaining hosts for this loop 18699 1726882352.27202: getting the next task for host managed_node1 18699 1726882352.27210: done getting next task for host managed_node1 18699 1726882352.27215: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18699 1726882352.27217: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882352.27233: getting variables 18699 1726882352.27235: in VariableManager get_vars() 18699 1726882352.27274: Calling all_inventory to load vars for managed_node1 18699 1726882352.27278: Calling groups_inventory to load vars for managed_node1 18699 1726882352.27280: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882352.27291: Calling all_plugins_play to load vars for managed_node1 18699 1726882352.27503: done sending task result for task 12673a56-9f93-1ce6-d207-000000000044 18699 1726882352.27580: WORKER PROCESS EXITING 18699 1726882352.27513: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882352.27621: Calling groups_plugins_play to load vars for managed_node1 18699 1726882352.29856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882352.32101: done with get_vars() 18699 1726882352.32128: done getting variables 18699 1726882352.32177: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:32:32 -0400 (0:00:00.197) 0:00:25.918 ****** 18699 1726882352.32206: entering _queue_task() for managed_node1/package 18699 1726882352.32481: worker is 1 (out of 1 available) 18699 1726882352.32500: exiting _queue_task() for managed_node1/package 18699 1726882352.32512: done queuing things up, now waiting for results queue to drain 18699 1726882352.32513: waiting for pending results... 
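The "Install packages" skip result above was driven by the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluating to False. A minimal sketch of what such a task looks like (a hypothetical reconstruction for illustration — the `name`/`state` module arguments are assumptions, not taken from this log):

```yaml
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skipped when every entry of network_packages is already a key in the
  # gathered package facts - the False condition recorded in the log above.
  when: not network_packages is subset(ansible_facts.packages.keys())
```

The `subset` test here is Jinja2's set-containment test, so the task only runs when at least one required package is missing from the host's package facts.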
18699 1726882352.32704: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18699 1726882352.32771: in run() - task 12673a56-9f93-1ce6-d207-000000000045 18699 1726882352.32782: variable 'ansible_search_path' from source: unknown 18699 1726882352.32785: variable 'ansible_search_path' from source: unknown 18699 1726882352.32819: calling self._execute() 18699 1726882352.32898: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.32902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.32912: variable 'omit' from source: magic vars 18699 1726882352.33196: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.33205: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882352.33291: variable 'network_state' from source: role '' defaults 18699 1726882352.33302: Evaluated conditional (network_state != {}): False 18699 1726882352.33305: when evaluation is False, skipping this task 18699 1726882352.33308: _execute() done 18699 1726882352.33312: dumping result to json 18699 1726882352.33314: done dumping result, returning 18699 1726882352.33324: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-1ce6-d207-000000000045] 18699 1726882352.33370: sending task result for task 12673a56-9f93-1ce6-d207-000000000045 18699 1726882352.33448: done sending task result for task 12673a56-9f93-1ce6-d207-000000000045 18699 1726882352.33450: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882352.33532: no more pending results, returning what we have 18699 1726882352.33536: results queue empty 18699 1726882352.33537: checking 
for any_errors_fatal 18699 1726882352.33544: done checking for any_errors_fatal 18699 1726882352.33544: checking for max_fail_percentage 18699 1726882352.33546: done checking for max_fail_percentage 18699 1726882352.33547: checking to see if all hosts have failed and the running result is not ok 18699 1726882352.33548: done checking to see if all hosts have failed 18699 1726882352.33548: getting the remaining hosts for this loop 18699 1726882352.33550: done getting the remaining hosts for this loop 18699 1726882352.33553: getting the next task for host managed_node1 18699 1726882352.33559: done getting next task for host managed_node1 18699 1726882352.33562: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18699 1726882352.33564: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882352.33619: getting variables 18699 1726882352.33621: in VariableManager get_vars() 18699 1726882352.33653: Calling all_inventory to load vars for managed_node1 18699 1726882352.33658: Calling groups_inventory to load vars for managed_node1 18699 1726882352.33661: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882352.33669: Calling all_plugins_play to load vars for managed_node1 18699 1726882352.33671: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882352.33674: Calling groups_plugins_play to load vars for managed_node1 18699 1726882352.35343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882352.37209: done with get_vars() 18699 1726882352.37244: done getting variables 18699 1726882352.37324: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:32:32 -0400 (0:00:00.053) 0:00:25.971 ****** 18699 1726882352.37527: entering _queue_task() for managed_node1/package 18699 1726882352.38109: worker is 1 (out of 1 available) 18699 1726882352.38121: exiting _queue_task() for managed_node1/package 18699 1726882352.38131: done queuing things up, now waiting for results queue to drain 18699 1726882352.38132: waiting for pending results... 
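Both nmstate-related install tasks above skip on `network_state != {}`: with `network_state` left at its role default of `{}`, the condition is False. A hedged sketch of how such a gate reads in a task (the module arguments are illustrative assumptions):

```yaml
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when:
    # Both conditions appear in the log: the first evaluated True,
    # the second False, so the task was skipped.
    - ansible_distribution_major_version != '6'
    - network_state != {}
```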
18699 1726882352.38374: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18699 1726882352.38419: in run() - task 12673a56-9f93-1ce6-d207-000000000046 18699 1726882352.38457: variable 'ansible_search_path' from source: unknown 18699 1726882352.38476: variable 'ansible_search_path' from source: unknown 18699 1726882352.38524: calling self._execute() 18699 1726882352.38638: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.38651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.38690: variable 'omit' from source: magic vars 18699 1726882352.39091: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.39197: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882352.39260: variable 'network_state' from source: role '' defaults 18699 1726882352.39276: Evaluated conditional (network_state != {}): False 18699 1726882352.39283: when evaluation is False, skipping this task 18699 1726882352.39290: _execute() done 18699 1726882352.39303: dumping result to json 18699 1726882352.39314: done dumping result, returning 18699 1726882352.39339: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-1ce6-d207-000000000046] 18699 1726882352.39342: sending task result for task 12673a56-9f93-1ce6-d207-000000000046 18699 1726882352.39504: done sending task result for task 12673a56-9f93-1ce6-d207-000000000046 18699 1726882352.39507: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882352.39579: no more pending results, returning what we have 18699 1726882352.39583: results queue empty 18699 1726882352.39584: checking for 
any_errors_fatal 18699 1726882352.39592: done checking for any_errors_fatal 18699 1726882352.39596: checking for max_fail_percentage 18699 1726882352.39599: done checking for max_fail_percentage 18699 1726882352.39600: checking to see if all hosts have failed and the running result is not ok 18699 1726882352.39600: done checking to see if all hosts have failed 18699 1726882352.39601: getting the remaining hosts for this loop 18699 1726882352.39603: done getting the remaining hosts for this loop 18699 1726882352.39607: getting the next task for host managed_node1 18699 1726882352.39615: done getting next task for host managed_node1 18699 1726882352.39618: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18699 1726882352.39621: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882352.39637: getting variables 18699 1726882352.39640: in VariableManager get_vars() 18699 1726882352.39801: Calling all_inventory to load vars for managed_node1 18699 1726882352.39805: Calling groups_inventory to load vars for managed_node1 18699 1726882352.39810: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882352.39823: Calling all_plugins_play to load vars for managed_node1 18699 1726882352.39826: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882352.39829: Calling groups_plugins_play to load vars for managed_node1 18699 1726882352.41450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882352.42989: done with get_vars() 18699 1726882352.43019: done getting variables 18699 1726882352.43079: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:32:32 -0400 (0:00:00.055) 0:00:26.027 ****** 18699 1726882352.43116: entering _queue_task() for managed_node1/service 18699 1726882352.43466: worker is 1 (out of 1 available) 18699 1726882352.43480: exiting _queue_task() for managed_node1/service 18699 1726882352.43495: done queuing things up, now waiting for results queue to drain 18699 1726882352.43496: waiting for pending results... 
18699 1726882352.43862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18699 1726882352.43922: in run() - task 12673a56-9f93-1ce6-d207-000000000047 18699 1726882352.43959: variable 'ansible_search_path' from source: unknown 18699 1726882352.43999: variable 'ansible_search_path' from source: unknown 18699 1726882352.44004: calling self._execute() 18699 1726882352.44112: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.44116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.44127: variable 'omit' from source: magic vars 18699 1726882352.44627: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.44631: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882352.44684: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882352.44889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882352.47609: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882352.47674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882352.47707: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882352.47738: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882352.47768: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882352.47845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18699 1726882352.47875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.47900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.47935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.47950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.48001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.48019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.48042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.48086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.48106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.48171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.48174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.48187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.48234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.48248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.48439: variable 'network_connections' from source: play vars 18699 1726882352.48442: variable 'profile' from source: play vars 18699 1726882352.48552: variable 'profile' from source: play vars 18699 1726882352.48555: variable 'interface' from source: set_fact 18699 1726882352.48574: variable 'interface' from source: set_fact 18699 1726882352.48656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882352.55420: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882352.55462: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882352.55535: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882352.55538: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882352.55566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882352.55591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882352.55617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.55644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882352.55690: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882352.56075: variable 'network_connections' from source: play vars 18699 1726882352.56078: variable 'profile' from source: play vars 18699 1726882352.56198: variable 'profile' from source: play vars 18699 1726882352.56202: variable 'interface' from source: set_fact 18699 1726882352.56208: variable 'interface' from source: set_fact 18699 1726882352.56234: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882352.56238: when evaluation is False, skipping this task 18699 1726882352.56240: _execute() done 18699 1726882352.56243: dumping result to json 18699 1726882352.56244: done dumping result, returning 18699 1726882352.56250: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000047] 18699 1726882352.56259: sending task result for task 12673a56-9f93-1ce6-d207-000000000047 18699 1726882352.56475: done sending task result for task 12673a56-9f93-1ce6-d207-000000000047 18699 1726882352.56480: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18699 1726882352.56525: no more pending results, returning what we have 18699 1726882352.56528: results queue empty 18699 1726882352.56529: checking for any_errors_fatal 18699 1726882352.56536: done checking for any_errors_fatal 18699 1726882352.56537: checking for max_fail_percentage 18699 1726882352.56539: done checking for max_fail_percentage 18699 1726882352.56540: checking to see if all hosts have failed and the running result is not ok 18699 1726882352.56540: done checking to see if all hosts have failed 18699 1726882352.56541: getting the remaining hosts for this loop 18699 1726882352.56543: done getting the remaining hosts for this loop 18699 1726882352.56547: getting the next task for host managed_node1 18699 1726882352.56553: done getting next task for host managed_node1 18699 1726882352.56556: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18699 1726882352.56559: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882352.56573: getting variables 18699 1726882352.56574: in VariableManager get_vars() 18699 1726882352.56615: Calling all_inventory to load vars for managed_node1 18699 1726882352.56618: Calling groups_inventory to load vars for managed_node1 18699 1726882352.56621: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882352.56632: Calling all_plugins_play to load vars for managed_node1 18699 1726882352.56634: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882352.56637: Calling groups_plugins_play to load vars for managed_node1 18699 1726882352.70267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882352.72133: done with get_vars() 18699 1726882352.72171: done getting variables 18699 1726882352.72244: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:32:32 -0400 (0:00:00.293) 0:00:26.320 ****** 18699 1726882352.72492: entering _queue_task() for managed_node1/service 18699 1726882352.72879: worker is 1 (out of 1 available) 18699 1726882352.72891: exiting _queue_task() for managed_node1/service 18699 1726882352.72904: done queuing things up, now waiting for results queue to drain 18699 1726882352.72905: waiting for pending results... 
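The restart task above skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` held, while the next task, "Enable and start NetworkManager", proceeds because `network_provider == "nm"` evaluated True. A minimal sketch of that service task (the `state`/`enabled` arguments are assumptions for illustration, not taken from the role source):

```yaml
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  # Runs whenever the NetworkManager provider is selected or a
  # network_state description is supplied - the True condition logged above.
  when: network_provider == "nm" or network_state != {}
```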
18699 1726882352.73439: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18699 1726882352.73762: in run() - task 12673a56-9f93-1ce6-d207-000000000048 18699 1726882352.73767: variable 'ansible_search_path' from source: unknown 18699 1726882352.73774: variable 'ansible_search_path' from source: unknown 18699 1726882352.73815: calling self._execute() 18699 1726882352.73997: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.74003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.74006: variable 'omit' from source: magic vars 18699 1726882352.75001: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.75005: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882352.75489: variable 'network_provider' from source: set_fact 18699 1726882352.75498: variable 'network_state' from source: role '' defaults 18699 1726882352.75502: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18699 1726882352.75506: variable 'omit' from source: magic vars 18699 1726882352.75509: variable 'omit' from source: magic vars 18699 1726882352.75710: variable 'network_service_name' from source: role '' defaults 18699 1726882352.75803: variable 'network_service_name' from source: role '' defaults 18699 1726882352.76176: variable '__network_provider_setup' from source: role '' defaults 18699 1726882352.76203: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882352.76434: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882352.76524: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882352.76664: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882352.77290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18699 1726882352.82548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882352.82827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882352.82972: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882352.83080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882352.83166: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882352.83408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.83428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.83511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.83705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.83710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.83845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18699 1726882352.83875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.83981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.84113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.84166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.85146: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18699 1726882352.85387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.85491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.85571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.85796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.85803: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.86126: variable 'ansible_python' from source: facts 18699 1726882352.86134: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18699 1726882352.86345: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882352.86579: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882352.86917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.86955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.86986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.87046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.87083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.87154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882352.87205: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882352.87255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.87407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882352.87410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882352.87745: variable 'network_connections' from source: play vars 18699 1726882352.87874: variable 'profile' from source: play vars 18699 1726882352.88110: variable 'profile' from source: play vars 18699 1726882352.88114: variable 'interface' from source: set_fact 18699 1726882352.88169: variable 'interface' from source: set_fact 18699 1726882352.88599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882352.88918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882352.89024: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882352.89090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882352.89140: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882352.89221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882352.89265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882352.89324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882352.89377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882352.89452: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882352.89944: variable 'network_connections' from source: play vars 18699 1726882352.89957: variable 'profile' from source: play vars 18699 1726882352.90055: variable 'profile' from source: play vars 18699 1726882352.90066: variable 'interface' from source: set_fact 18699 1726882352.90129: variable 'interface' from source: set_fact 18699 1726882352.90180: variable '__network_packages_default_wireless' from source: role '' defaults 18699 1726882352.90282: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882352.90823: variable 'network_connections' from source: play vars 18699 1726882352.90826: variable 'profile' from source: play vars 18699 1726882352.90880: variable 'profile' from source: play vars 18699 1726882352.90923: variable 'interface' from source: set_fact 18699 1726882352.91167: variable 'interface' from source: set_fact 18699 1726882352.91170: variable '__network_packages_default_team' from source: role '' defaults 18699 1726882352.91295: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882352.91786: variable 
'network_connections' from source: play vars 18699 1726882352.91800: variable 'profile' from source: play vars 18699 1726882352.91887: variable 'profile' from source: play vars 18699 1726882352.91931: variable 'interface' from source: set_fact 18699 1726882352.92100: variable 'interface' from source: set_fact 18699 1726882352.92142: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882352.92499: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882352.92503: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882352.92505: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882352.92809: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18699 1726882352.93576: variable 'network_connections' from source: play vars 18699 1726882352.93586: variable 'profile' from source: play vars 18699 1726882352.93837: variable 'profile' from source: play vars 18699 1726882352.93840: variable 'interface' from source: set_fact 18699 1726882352.93842: variable 'interface' from source: set_fact 18699 1726882352.93844: variable 'ansible_distribution' from source: facts 18699 1726882352.93846: variable '__network_rh_distros' from source: role '' defaults 18699 1726882352.93848: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.93864: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18699 1726882352.94097: variable 'ansible_distribution' from source: facts 18699 1726882352.94107: variable '__network_rh_distros' from source: role '' defaults 18699 1726882352.94116: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.94141: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18699 1726882352.94472: variable 'ansible_distribution' from source: 
facts 18699 1726882352.94597: variable '__network_rh_distros' from source: role '' defaults 18699 1726882352.94601: variable 'ansible_distribution_major_version' from source: facts 18699 1726882352.94605: variable 'network_provider' from source: set_fact 18699 1726882352.94607: variable 'omit' from source: magic vars 18699 1726882352.94729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882352.94733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882352.94735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882352.94737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882352.94740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882352.94772: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882352.94780: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.94789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.94905: Set connection var ansible_connection to ssh 18699 1726882352.94918: Set connection var ansible_pipelining to False 18699 1726882352.94928: Set connection var ansible_shell_executable to /bin/sh 18699 1726882352.94950: Set connection var ansible_timeout to 10 18699 1726882352.94962: Set connection var ansible_shell_type to sh 18699 1726882352.94981: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882352.95018: variable 'ansible_shell_executable' from source: unknown 18699 1726882352.95026: variable 'ansible_connection' from source: unknown 18699 1726882352.95034: variable 'ansible_module_compression' from source: unknown 18699 1726882352.95040: 
variable 'ansible_shell_type' from source: unknown 18699 1726882352.95061: variable 'ansible_shell_executable' from source: unknown 18699 1726882352.95068: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882352.95167: variable 'ansible_pipelining' from source: unknown 18699 1726882352.95170: variable 'ansible_timeout' from source: unknown 18699 1726882352.95172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882352.95217: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882352.95235: variable 'omit' from source: magic vars 18699 1726882352.95246: starting attempt loop 18699 1726882352.95252: running the handler 18699 1726882352.95346: variable 'ansible_facts' from source: unknown 18699 1726882352.96591: _low_level_execute_command(): starting 18699 1726882352.96624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882352.97645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882352.97740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882352.97940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882352.99512: stdout chunk (state=3): >>>/root <<< 18699 1726882352.99663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882352.99666: stdout chunk (state=3): >>><<< 18699 1726882352.99668: stderr chunk (state=3): >>><<< 18699 1726882352.99781: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
18699 1726882352.99784: _low_level_execute_command(): starting 18699 1726882352.99787: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399 `" && echo ansible-tmp-1726882352.9969528-19973-96360225530399="` echo /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399 `" ) && sleep 0' 18699 1726882353.00328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882353.00331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882353.00334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882353.00336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882353.00339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882353.00344: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882353.00354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.00368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882353.00437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882353.00440: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882353.00442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882353.00444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882353.00446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882353.00448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 <<< 18699 1726882353.00450: stderr chunk (state=3): >>>debug2: match found <<< 18699 1726882353.00452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.00504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.00545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.00587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.00674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.02605: stdout chunk (state=3): >>>ansible-tmp-1726882352.9969528-19973-96360225530399=/root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399 <<< 18699 1726882353.02749: stdout chunk (state=3): >>><<< 18699 1726882353.02773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.02784: stderr chunk (state=3): >>><<< 18699 1726882353.02844: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882352.9969528-19973-96360225530399=/root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882353.02897: variable 'ansible_module_compression' from source: unknown 18699 1726882353.02990: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18699 1726882353.03042: variable 'ansible_facts' from source: unknown 18699 1726882353.03598: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py 18699 1726882353.04118: Sending initial data 18699 1726882353.04162: Sent initial data (155 bytes) 18699 1726882353.04744: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.04801: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.04804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.04911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.04971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.06467: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18699 1726882353.06474: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882353.06510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882353.06551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpo73tzb2z /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py <<< 18699 1726882353.06558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py" <<< 18699 1726882353.06594: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 18699 1726882353.06601: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpo73tzb2z" to remote "/root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py" <<< 18699 1726882353.07725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.07764: stderr chunk (state=3): >>><<< 18699 1726882353.07767: stdout chunk (state=3): >>><<< 18699 1726882353.07799: done transferring module to remote 18699 1726882353.07819: _low_level_execute_command(): starting 18699 1726882353.07823: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/ /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py && sleep 0' 18699 1726882353.08511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.08526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.08559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.08563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.08621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.10354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.10385: stderr chunk (state=3): >>><<< 18699 1726882353.10394: stdout chunk (state=3): >>><<< 18699 1726882353.10413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882353.10417: _low_level_execute_command(): starting 18699 1726882353.10421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/AnsiballZ_systemd.py && sleep 0' 18699 1726882353.11000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.11411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.11497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.40097: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, 
"status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10776576", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304570880", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1269480000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 18699 1726882353.40118: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", 
"ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.<<< 18699 1726882353.40126: stdout chunk (state=3): >>>target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": 
"none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18699 1726882353.41820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882353.41850: stderr chunk (state=3): >>><<< 18699 1726882353.41853: stdout chunk (state=3): >>><<< 18699 1726882353.41871: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", 
"ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10776576", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304570880", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1269480000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882353.42001: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882353.42032: _low_level_execute_command(): starting 18699 1726882353.42036: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882352.9969528-19973-96360225530399/ > /dev/null 2>&1 && sleep 0' 18699 1726882353.42561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882353.42565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882353.42567: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.42569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882353.42571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.42628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.42632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.42651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.42703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.44480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.44511: stderr chunk (state=3): >>><<< 18699 1726882353.44514: stdout chunk (state=3): >>><<< 18699 1726882353.44543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882353.44546: handler run complete 18699 1726882353.44643: attempt loop complete, returning result 18699 1726882353.44646: _execute() done 18699 1726882353.44648: dumping result to json 18699 1726882353.44650: done dumping result, returning 18699 1726882353.44672: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-1ce6-d207-000000000048] 18699 1726882353.44675: sending task result for task 12673a56-9f93-1ce6-d207-000000000048 18699 1726882353.44992: done sending task result for task 12673a56-9f93-1ce6-d207-000000000048 18699 1726882353.44998: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882353.45054: no more pending results, returning what we have 18699 1726882353.45057: results queue empty 18699 1726882353.45057: checking for any_errors_fatal 18699 1726882353.45063: done checking for any_errors_fatal 18699 1726882353.45064: checking for max_fail_percentage 18699 1726882353.45065: done checking for max_fail_percentage 18699 1726882353.45066: checking to see if all hosts have failed and the running result is not ok 18699 1726882353.45067: done checking to see if all hosts have failed 18699 1726882353.45067: getting the remaining hosts for this loop 18699 1726882353.45069: done getting the 
remaining hosts for this loop 18699 1726882353.45072: getting the next task for host managed_node1 18699 1726882353.45077: done getting next task for host managed_node1 18699 1726882353.45081: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18699 1726882353.45083: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882353.45091: getting variables 18699 1726882353.45095: in VariableManager get_vars() 18699 1726882353.45125: Calling all_inventory to load vars for managed_node1 18699 1726882353.45128: Calling groups_inventory to load vars for managed_node1 18699 1726882353.45130: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882353.45139: Calling all_plugins_play to load vars for managed_node1 18699 1726882353.45141: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882353.45144: Calling groups_plugins_play to load vars for managed_node1 18699 1726882353.46240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882353.47702: done with get_vars() 18699 1726882353.47724: done getting variables 18699 1726882353.47782: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:32:33 -0400 (0:00:00.753) 0:00:27.074 
****** 18699 1726882353.47817: entering _queue_task() for managed_node1/service 18699 1726882353.48298: worker is 1 (out of 1 available) 18699 1726882353.48313: exiting _queue_task() for managed_node1/service 18699 1726882353.48324: done queuing things up, now waiting for results queue to drain 18699 1726882353.48325: waiting for pending results... 18699 1726882353.48538: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18699 1726882353.48624: in run() - task 12673a56-9f93-1ce6-d207-000000000049 18699 1726882353.48635: variable 'ansible_search_path' from source: unknown 18699 1726882353.48638: variable 'ansible_search_path' from source: unknown 18699 1726882353.48676: calling self._execute() 18699 1726882353.48759: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882353.48764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882353.48772: variable 'omit' from source: magic vars 18699 1726882353.49053: variable 'ansible_distribution_major_version' from source: facts 18699 1726882353.49062: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882353.49147: variable 'network_provider' from source: set_fact 18699 1726882353.49150: Evaluated conditional (network_provider == "nm"): True 18699 1726882353.49222: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882353.49282: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882353.49399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882353.51379: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882353.51383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882353.51385: Loading 
FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882353.51387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882353.51389: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882353.51444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882353.51472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882353.51500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882353.51535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882353.51549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882353.51597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882353.51616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882353.51642: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882353.51680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882353.51702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882353.51735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882353.51758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882353.51780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882353.51820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882353.51831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882353.51962: variable 'network_connections' from source: play vars 18699 1726882353.51974: variable 'profile' from source: play vars 18699 1726882353.52042: variable 'profile' from source: play vars 18699 1726882353.52046: variable 
'interface' from source: set_fact 18699 1726882353.52106: variable 'interface' from source: set_fact 18699 1726882353.52172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882353.52389: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882353.52407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882353.52445: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882353.52582: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882353.52589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882353.52599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882353.52610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882353.52613: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882353.52670: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882353.52946: variable 'network_connections' from source: play vars 18699 1726882353.52950: variable 'profile' from source: play vars 18699 1726882353.53008: variable 'profile' from source: play vars 18699 1726882353.53012: variable 'interface' from source: set_fact 18699 1726882353.53089: 
variable 'interface' from source: set_fact 18699 1726882353.53120: Evaluated conditional (__network_wpa_supplicant_required): False 18699 1726882353.53123: when evaluation is False, skipping this task 18699 1726882353.53125: _execute() done 18699 1726882353.53140: dumping result to json 18699 1726882353.53199: done dumping result, returning 18699 1726882353.53212: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-1ce6-d207-000000000049] 18699 1726882353.53215: sending task result for task 12673a56-9f93-1ce6-d207-000000000049 18699 1726882353.53571: done sending task result for task 12673a56-9f93-1ce6-d207-000000000049 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18699 1726882353.53864: no more pending results, returning what we have 18699 1726882353.53892: results queue empty 18699 1726882353.53895: checking for any_errors_fatal 18699 1726882353.54010: done checking for any_errors_fatal 18699 1726882353.54012: checking for max_fail_percentage 18699 1726882353.54014: done checking for max_fail_percentage 18699 1726882353.54014: checking to see if all hosts have failed and the running result is not ok 18699 1726882353.54015: done checking to see if all hosts have failed 18699 1726882353.54016: getting the remaining hosts for this loop 18699 1726882353.54017: done getting the remaining hosts for this loop 18699 1726882353.54021: getting the next task for host managed_node1 18699 1726882353.54026: done getting next task for host managed_node1 18699 1726882353.54030: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18699 1726882353.54032: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18699 1726882353.54044: getting variables 18699 1726882353.54046: in VariableManager get_vars() 18699 1726882353.54079: Calling all_inventory to load vars for managed_node1 18699 1726882353.54082: Calling groups_inventory to load vars for managed_node1 18699 1726882353.54084: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882353.54145: Calling all_plugins_play to load vars for managed_node1 18699 1726882353.54149: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882353.54152: Calling groups_plugins_play to load vars for managed_node1 18699 1726882353.54707: WORKER PROCESS EXITING 18699 1726882353.55833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882353.57808: done with get_vars() 18699 1726882353.57836: done getting variables 18699 1726882353.57908: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:32:33 -0400 (0:00:00.101) 0:00:27.175 ****** 18699 1726882353.57941: entering _queue_task() for managed_node1/service 18699 1726882353.58413: worker is 1 (out of 1 available) 18699 1726882353.58424: exiting _queue_task() for managed_node1/service 18699 1726882353.58435: done queuing things up, now waiting for results queue to drain 18699 1726882353.58436: waiting for pending results... 
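The skip result above records which conditional failed (`"false_condition": "__network_wpa_supplicant_required"`) along with a human-readable `skip_reason`. A minimal sketch of that short-circuit behaviour, with hypothetical helper names (this is not ansible-core's actual TaskExecutor code):

```python
def evaluate_task(conditionals, run_task):
    """Evaluate 'when' conditionals in order; skip on the first false one.

    Mirrors the log output: the first conditional that evaluates False is
    recorded as 'false_condition' and the task body never runs.
    Hypothetical sketch, not ansible-core's internal API.
    """
    for expr, value in conditionals:
        if not value:
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return run_task()


# The two conditionals evaluated for the wpa_supplicant task in the log:
result = evaluate_task(
    [
        ("ansible_distribution_major_version != '6'", True),
        ("__network_wpa_supplicant_required", False),
    ],
    lambda: {"changed": True},
)
```

Note how the first conditional evaluated True, so evaluation proceeded to the second, which produced the skip.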
18699 1726882353.58724: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18699 1726882353.58791: in run() - task 12673a56-9f93-1ce6-d207-00000000004a 18699 1726882353.58828: variable 'ansible_search_path' from source: unknown 18699 1726882353.58842: variable 'ansible_search_path' from source: unknown 18699 1726882353.58892: calling self._execute() 18699 1726882353.59016: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882353.59037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882353.59055: variable 'omit' from source: magic vars 18699 1726882353.59564: variable 'ansible_distribution_major_version' from source: facts 18699 1726882353.59600: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882353.59790: variable 'network_provider' from source: set_fact 18699 1726882353.59795: Evaluated conditional (network_provider == "initscripts"): False 18699 1726882353.59798: when evaluation is False, skipping this task 18699 1726882353.59800: _execute() done 18699 1726882353.59803: dumping result to json 18699 1726882353.59805: done dumping result, returning 18699 1726882353.59807: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-1ce6-d207-00000000004a] 18699 1726882353.59810: sending task result for task 12673a56-9f93-1ce6-d207-00000000004a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882353.60072: no more pending results, returning what we have 18699 1726882353.60075: results queue empty 18699 1726882353.60076: checking for any_errors_fatal 18699 1726882353.60085: done checking for any_errors_fatal 18699 1726882353.60087: checking for max_fail_percentage 18699 1726882353.60088: done checking for max_fail_percentage 18699 
1726882353.60090: checking to see if all hosts have failed and the running result is not ok 18699 1726882353.60091: done checking to see if all hosts have failed 18699 1726882353.60091: getting the remaining hosts for this loop 18699 1726882353.60095: done getting the remaining hosts for this loop 18699 1726882353.60099: getting the next task for host managed_node1 18699 1726882353.60106: done getting next task for host managed_node1 18699 1726882353.60110: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18699 1726882353.60113: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882353.60129: getting variables 18699 1726882353.60131: in VariableManager get_vars() 18699 1726882353.60169: Calling all_inventory to load vars for managed_node1 18699 1726882353.60172: Calling groups_inventory to load vars for managed_node1 18699 1726882353.60175: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882353.60187: Calling all_plugins_play to load vars for managed_node1 18699 1726882353.60190: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882353.60398: Calling groups_plugins_play to load vars for managed_node1 18699 1726882353.61109: done sending task result for task 12673a56-9f93-1ce6-d207-00000000004a 18699 1726882353.61112: WORKER PROCESS EXITING 18699 1726882353.61815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882353.62916: done with get_vars() 18699 1726882353.62940: done getting variables 18699 1726882353.63001: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:32:33 -0400 (0:00:00.050) 0:00:27.226 ****** 18699 1726882353.63032: entering _queue_task() for managed_node1/copy 18699 1726882353.63428: worker is 1 (out of 1 available) 18699 1726882353.63441: exiting _queue_task() for managed_node1/copy 18699 1726882353.63452: done queuing things up, now waiting for results queue to drain 18699 1726882353.63453: waiting for pending results... 18699 1726882353.63671: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18699 1726882353.63789: in run() - task 12673a56-9f93-1ce6-d207-00000000004b 18699 1726882353.63826: variable 'ansible_search_path' from source: unknown 18699 1726882353.63836: variable 'ansible_search_path' from source: unknown 18699 1726882353.63884: calling self._execute() 18699 1726882353.63984: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882353.64010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882353.64027: variable 'omit' from source: magic vars 18699 1726882353.64549: variable 'ansible_distribution_major_version' from source: facts 18699 1726882353.64580: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882353.64686: variable 'network_provider' from source: set_fact 18699 1726882353.64702: Evaluated conditional (network_provider == "initscripts"): False 18699 1726882353.64709: when evaluation is False, skipping this task 18699 1726882353.64712: _execute() done 18699 1726882353.64715: dumping result to json 
18699 1726882353.64717: done dumping result, returning 18699 1726882353.64726: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-1ce6-d207-00000000004b] 18699 1726882353.64729: sending task result for task 12673a56-9f93-1ce6-d207-00000000004b 18699 1726882353.64879: done sending task result for task 12673a56-9f93-1ce6-d207-00000000004b 18699 1726882353.64882: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18699 1726882353.64936: no more pending results, returning what we have 18699 1726882353.64939: results queue empty 18699 1726882353.64941: checking for any_errors_fatal 18699 1726882353.64945: done checking for any_errors_fatal 18699 1726882353.64946: checking for max_fail_percentage 18699 1726882353.64948: done checking for max_fail_percentage 18699 1726882353.64949: checking to see if all hosts have failed and the running result is not ok 18699 1726882353.64949: done checking to see if all hosts have failed 18699 1726882353.64950: getting the remaining hosts for this loop 18699 1726882353.64951: done getting the remaining hosts for this loop 18699 1726882353.64955: getting the next task for host managed_node1 18699 1726882353.64960: done getting next task for host managed_node1 18699 1726882353.64963: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18699 1726882353.64965: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882353.64981: getting variables 18699 1726882353.64982: in VariableManager get_vars() 18699 1726882353.65022: Calling all_inventory to load vars for managed_node1 18699 1726882353.65024: Calling groups_inventory to load vars for managed_node1 18699 1726882353.65026: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882353.65035: Calling all_plugins_play to load vars for managed_node1 18699 1726882353.65038: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882353.65040: Calling groups_plugins_play to load vars for managed_node1 18699 1726882353.66095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882353.67689: done with get_vars() 18699 1726882353.67723: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:32:33 -0400 (0:00:00.047) 0:00:27.274 ****** 18699 1726882353.67825: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18699 1726882353.68359: worker is 1 (out of 1 available) 18699 1726882353.68372: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18699 1726882353.68384: done queuing things up, now waiting for results queue to drain 18699 1726882353.68385: waiting for pending results... 
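The "Enable network service" task earlier was both skipped and censored: because `no_log: true` was set, its result was replaced with a fixed notice, keeping only `changed`. A hedged sketch of that censoring behaviour (hypothetical function, not ansible-core's actual result-cleaning code):

```python
CENSORED = (
    "the output has been hidden due to the fact that 'no_log: true' "
    "was specified for this result"
)


def censor_result(result, no_log):
    """Replace a task result with a fixed notice when no_log is set.

    As seen in the log, the censored result retains 'changed' but drops
    all other fields (command lines, module output, etc.).
    Hypothetical sketch of the behaviour only.
    """
    if not no_log:
        return result
    return {"censored": CENSORED, "changed": result.get("changed", False)}


out = censor_result(
    {"changed": False, "cmd": "systemctl enable network"}, no_log=True
)
```

This is why sensitive module arguments and output never appear even at `-vvvv` verbosity when `no_log` is in effect.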
18699 1726882353.68763: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18699 1726882353.68828: in run() - task 12673a56-9f93-1ce6-d207-00000000004c 18699 1726882353.68840: variable 'ansible_search_path' from source: unknown 18699 1726882353.68844: variable 'ansible_search_path' from source: unknown 18699 1726882353.68888: calling self._execute() 18699 1726882353.68978: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882353.68983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882353.68992: variable 'omit' from source: magic vars 18699 1726882353.69399: variable 'ansible_distribution_major_version' from source: facts 18699 1726882353.69403: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882353.69405: variable 'omit' from source: magic vars 18699 1726882353.69406: variable 'omit' from source: magic vars 18699 1726882353.69557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882353.72324: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882353.72402: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882353.72444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882353.72483: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882353.72520: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882353.72612: variable 'network_provider' from source: set_fact 18699 1726882353.72717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882353.72738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882353.72759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882353.72784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882353.72799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882353.72851: variable 'omit' from source: magic vars 18699 1726882353.72933: variable 'omit' from source: magic vars 18699 1726882353.73006: variable 'network_connections' from source: play vars 18699 1726882353.73016: variable 'profile' from source: play vars 18699 1726882353.73063: variable 'profile' from source: play vars 18699 1726882353.73067: variable 'interface' from source: set_fact 18699 1726882353.73113: variable 'interface' from source: set_fact 18699 1726882353.73213: variable 'omit' from source: magic vars 18699 1726882353.73220: variable '__lsr_ansible_managed' from source: task vars 18699 1726882353.73264: variable '__lsr_ansible_managed' from source: task vars 18699 1726882353.73454: Loaded config def from plugin (lookup/template) 18699 1726882353.73457: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18699 1726882353.73479: File lookup term: get_ansible_managed.j2 18699 
1726882353.73482: variable 'ansible_search_path' from source: unknown 18699 1726882353.73487: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18699 1726882353.73501: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18699 1726882353.73516: variable 'ansible_search_path' from source: unknown 18699 1726882353.79872: variable 'ansible_managed' from source: unknown 18699 1726882353.79953: variable 'omit' from source: magic vars 18699 1726882353.79980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882353.80009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882353.80030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882353.80046: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882353.80088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882353.80100: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882353.80103: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882353.80106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882353.80201: Set connection var ansible_connection to ssh 18699 1726882353.80204: Set connection var ansible_pipelining to False 18699 1726882353.80206: Set connection var ansible_shell_executable to /bin/sh 18699 1726882353.80209: Set connection var ansible_timeout to 10 18699 1726882353.80211: Set connection var ansible_shell_type to sh 18699 1726882353.80213: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882353.80310: variable 'ansible_shell_executable' from source: unknown 18699 1726882353.80313: variable 'ansible_connection' from source: unknown 18699 1726882353.80316: variable 'ansible_module_compression' from source: unknown 18699 1726882353.80318: variable 'ansible_shell_type' from source: unknown 18699 1726882353.80320: variable 'ansible_shell_executable' from source: unknown 18699 1726882353.80322: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882353.80324: variable 'ansible_pipelining' from source: unknown 18699 1726882353.80326: variable 'ansible_timeout' from source: unknown 18699 1726882353.80328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882353.80524: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882353.80535: variable 'omit' from source: magic vars 18699 1726882353.80537: starting attempt loop 18699 1726882353.80540: running the handler 18699 1726882353.80543: _low_level_execute_command(): starting 18699 1726882353.80545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882353.81180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882353.81208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.81226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.81239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.81258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.81334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.82992: stdout chunk (state=3): >>>/root <<< 18699 1726882353.83253: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.83256: stdout chunk (state=3): >>><<< 18699 1726882353.83259: stderr chunk (state=3): >>><<< 18699 1726882353.83262: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882353.83264: _low_level_execute_command(): starting 18699 1726882353.83267: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078 `" && echo ansible-tmp-1726882353.8316212-20015-218370984031078="` echo /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078 `" ) && sleep 0' 18699 1726882353.83821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882353.83839: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882353.83855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882353.83888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882353.83891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882353.83945: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.84006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.84034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.84071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.84123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.85984: stdout chunk (state=3): >>>ansible-tmp-1726882353.8316212-20015-218370984031078=/root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078 <<< 18699 1726882353.86148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.86151: stdout chunk (state=3): >>><<< 18699 1726882353.86154: stderr chunk (state=3): >>><<< 18699 1726882353.86299: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882353.8316212-20015-218370984031078=/root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882353.86302: variable 'ansible_module_compression' from source: unknown 18699 1726882353.86305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18699 1726882353.86326: variable 'ansible_facts' from source: unknown 18699 1726882353.86447: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py 18699 1726882353.86662: Sending initial data 18699 1726882353.86665: Sent initial data (168 bytes) 18699 1726882353.87249: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882353.87308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.87388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.87440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.87551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.89078: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882353.89131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882353.89185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpch73d31_ /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py <<< 18699 1726882353.89213: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py" <<< 18699 1726882353.89259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpch73d31_" to remote "/root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py" <<< 18699 1726882353.91704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.91708: stdout chunk (state=3): >>><<< 18699 1726882353.91711: stderr chunk (state=3): >>><<< 18699 1726882353.91713: done transferring module to remote 18699 1726882353.91715: _low_level_execute_command(): starting 18699 1726882353.91718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/ /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py && sleep 0' 18699 1726882353.92723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882353.92747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882353.92877: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 18699 1726882353.92880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.92910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.92946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.92994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882353.94759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882353.94833: stdout chunk (state=3): >>><<< 18699 1726882353.94837: stderr chunk (state=3): >>><<< 18699 1726882353.94863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882353.94967: _low_level_execute_command(): starting 18699 1726882353.94971: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/AnsiballZ_network_connections.py && sleep 0' 18699 1726882353.95667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882353.95682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882353.95708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882353.95732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.95812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882353.95831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882353.95865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882353.95886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882353.95977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882354.25457: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18699 1726882354.27503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882354.27507: stdout chunk (state=3): >>><<< 18699 1726882354.27509: stderr chunk (state=3): >>><<< 18699 1726882354.27512: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882354.27514: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882354.27518: _low_level_execute_command(): starting 18699 1726882354.27520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882353.8316212-20015-218370984031078/ > /dev/null 2>&1 && sleep 0' 18699 1726882354.28149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882354.28157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882354.28168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882354.28183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882354.28211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882354.28217: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882354.28228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 
1726882354.28243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882354.28319: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882354.28347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882354.28361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882354.28378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882354.28446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882354.30401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882354.30405: stdout chunk (state=3): >>><<< 18699 1726882354.30407: stderr chunk (state=3): >>><<< 18699 1726882354.30410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882354.30412: handler run complete 18699 1726882354.30414: attempt loop complete, returning result 18699 1726882354.30416: _execute() done 18699 1726882354.30801: dumping result to json 18699 1726882354.30805: done dumping result, returning 18699 1726882354.30807: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-1ce6-d207-00000000004c] 18699 1726882354.30809: sending task result for task 12673a56-9f93-1ce6-d207-00000000004c 18699 1726882354.30891: done sending task result for task 12673a56-9f93-1ce6-d207-00000000004c 18699 1726882354.30899: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18699 1726882354.30989: no more pending results, returning what we have 18699 1726882354.30992: results queue empty 18699 1726882354.30997: checking for any_errors_fatal 18699 1726882354.31003: done checking for any_errors_fatal 18699 1726882354.31004: checking for max_fail_percentage 18699 1726882354.31006: done checking for max_fail_percentage 18699 1726882354.31007: checking to see if all hosts have failed and the running result is not ok 18699 1726882354.31012: done checking to see if all hosts have 
failed 18699 1726882354.31013: getting the remaining hosts for this loop 18699 1726882354.31014: done getting the remaining hosts for this loop 18699 1726882354.31017: getting the next task for host managed_node1 18699 1726882354.31022: done getting next task for host managed_node1 18699 1726882354.31025: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18699 1726882354.31027: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882354.31036: getting variables 18699 1726882354.31037: in VariableManager get_vars() 18699 1726882354.31070: Calling all_inventory to load vars for managed_node1 18699 1726882354.31072: Calling groups_inventory to load vars for managed_node1 18699 1726882354.31074: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882354.31082: Calling all_plugins_play to load vars for managed_node1 18699 1726882354.31085: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882354.31087: Calling groups_plugins_play to load vars for managed_node1 18699 1726882354.32962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882354.35035: done with get_vars() 18699 1726882354.35060: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:32:34 -0400 (0:00:00.674) 0:00:27.948 ****** 18699 1726882354.35265: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18699 1726882354.36186: worker is 1 (out of 1 available) 18699 1726882354.36214: exiting _queue_task() for 
managed_node1/fedora.linux_system_roles.network_state 18699 1726882354.36227: done queuing things up, now waiting for results queue to drain 18699 1726882354.36228: waiting for pending results... 18699 1726882354.36714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18699 1726882354.36931: in run() - task 12673a56-9f93-1ce6-d207-00000000004d 18699 1726882354.37100: variable 'ansible_search_path' from source: unknown 18699 1726882354.37103: variable 'ansible_search_path' from source: unknown 18699 1726882354.37106: calling self._execute() 18699 1726882354.37217: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.37483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.37486: variable 'omit' from source: magic vars 18699 1726882354.38127: variable 'ansible_distribution_major_version' from source: facts 18699 1726882354.38263: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882354.38468: variable 'network_state' from source: role '' defaults 18699 1726882354.38516: Evaluated conditional (network_state != {}): False 18699 1726882354.38798: when evaluation is False, skipping this task 18699 1726882354.38803: _execute() done 18699 1726882354.38807: dumping result to json 18699 1726882354.38810: done dumping result, returning 18699 1726882354.38815: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-1ce6-d207-00000000004d] 18699 1726882354.38818: sending task result for task 12673a56-9f93-1ce6-d207-00000000004d 18699 1726882354.38879: done sending task result for task 12673a56-9f93-1ce6-d207-00000000004d 18699 1726882354.38882: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882354.39130: no more 
pending results, returning what we have 18699 1726882354.39134: results queue empty 18699 1726882354.39135: checking for any_errors_fatal 18699 1726882354.39143: done checking for any_errors_fatal 18699 1726882354.39144: checking for max_fail_percentage 18699 1726882354.39146: done checking for max_fail_percentage 18699 1726882354.39146: checking to see if all hosts have failed and the running result is not ok 18699 1726882354.39147: done checking to see if all hosts have failed 18699 1726882354.39148: getting the remaining hosts for this loop 18699 1726882354.39149: done getting the remaining hosts for this loop 18699 1726882354.39153: getting the next task for host managed_node1 18699 1726882354.39157: done getting next task for host managed_node1 18699 1726882354.39161: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18699 1726882354.39164: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882354.39177: getting variables 18699 1726882354.39179: in VariableManager get_vars() 18699 1726882354.39218: Calling all_inventory to load vars for managed_node1 18699 1726882354.39222: Calling groups_inventory to load vars for managed_node1 18699 1726882354.39224: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882354.39234: Calling all_plugins_play to load vars for managed_node1 18699 1726882354.39237: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882354.39239: Calling groups_plugins_play to load vars for managed_node1 18699 1726882354.40797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882354.43565: done with get_vars() 18699 1726882354.43595: done getting variables 18699 1726882354.43669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:32:34 -0400 (0:00:00.084) 0:00:28.033 ****** 18699 1726882354.43705: entering _queue_task() for managed_node1/debug 18699 1726882354.44086: worker is 1 (out of 1 available) 18699 1726882354.44106: exiting _queue_task() for managed_node1/debug 18699 1726882354.44120: done queuing things up, now waiting for results queue to drain 18699 1726882354.44121: waiting for pending results... 
18699 1726882354.44403: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18699 1726882354.44514: in run() - task 12673a56-9f93-1ce6-d207-00000000004e 18699 1726882354.44536: variable 'ansible_search_path' from source: unknown 18699 1726882354.44544: variable 'ansible_search_path' from source: unknown 18699 1726882354.44601: calling self._execute() 18699 1726882354.44802: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.44806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.44809: variable 'omit' from source: magic vars 18699 1726882354.45153: variable 'ansible_distribution_major_version' from source: facts 18699 1726882354.45171: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882354.45203: variable 'omit' from source: magic vars 18699 1726882354.45263: variable 'omit' from source: magic vars 18699 1726882354.45362: variable 'omit' from source: magic vars 18699 1726882354.45370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882354.45414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882354.45439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882354.45462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882354.45498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882354.45581: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882354.45585: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.45587: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18699 1726882354.45659: Set connection var ansible_connection to ssh 18699 1726882354.45802: Set connection var ansible_pipelining to False 18699 1726882354.45805: Set connection var ansible_shell_executable to /bin/sh 18699 1726882354.45807: Set connection var ansible_timeout to 10 18699 1726882354.45809: Set connection var ansible_shell_type to sh 18699 1726882354.45811: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882354.45812: variable 'ansible_shell_executable' from source: unknown 18699 1726882354.45814: variable 'ansible_connection' from source: unknown 18699 1726882354.45816: variable 'ansible_module_compression' from source: unknown 18699 1726882354.45818: variable 'ansible_shell_type' from source: unknown 18699 1726882354.45820: variable 'ansible_shell_executable' from source: unknown 18699 1726882354.45821: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.45823: variable 'ansible_pipelining' from source: unknown 18699 1726882354.45825: variable 'ansible_timeout' from source: unknown 18699 1726882354.45826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.46112: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882354.46116: variable 'omit' from source: magic vars 18699 1726882354.46118: starting attempt loop 18699 1726882354.46120: running the handler 18699 1726882354.46208: variable '__network_connections_result' from source: set_fact 18699 1726882354.46268: handler run complete 18699 1726882354.46291: attempt loop complete, returning result 18699 1726882354.46303: _execute() done 18699 1726882354.46310: dumping result to json 18699 1726882354.46316: 
done dumping result, returning 18699 1726882354.46332: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-1ce6-d207-00000000004e] 18699 1726882354.46341: sending task result for task 12673a56-9f93-1ce6-d207-00000000004e 18699 1726882354.46446: done sending task result for task 12673a56-9f93-1ce6-d207-00000000004e 18699 1726882354.46453: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 18699 1726882354.46550: no more pending results, returning what we have 18699 1726882354.46553: results queue empty 18699 1726882354.46554: checking for any_errors_fatal 18699 1726882354.46560: done checking for any_errors_fatal 18699 1726882354.46561: checking for max_fail_percentage 18699 1726882354.46562: done checking for max_fail_percentage 18699 1726882354.46563: checking to see if all hosts have failed and the running result is not ok 18699 1726882354.46564: done checking to see if all hosts have failed 18699 1726882354.46564: getting the remaining hosts for this loop 18699 1726882354.46566: done getting the remaining hosts for this loop 18699 1726882354.46570: getting the next task for host managed_node1 18699 1726882354.46576: done getting next task for host managed_node1 18699 1726882354.46579: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18699 1726882354.46581: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882354.46590: getting variables 18699 1726882354.46591: in VariableManager get_vars() 18699 1726882354.46626: Calling all_inventory to load vars for managed_node1 18699 1726882354.46629: Calling groups_inventory to load vars for managed_node1 18699 1726882354.46631: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882354.46640: Calling all_plugins_play to load vars for managed_node1 18699 1726882354.46643: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882354.46645: Calling groups_plugins_play to load vars for managed_node1 18699 1726882354.48570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882354.50691: done with get_vars() 18699 1726882354.50718: done getting variables 18699 1726882354.50783: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:32:34 -0400 (0:00:00.071) 0:00:28.104 ****** 18699 1726882354.50820: entering _queue_task() for managed_node1/debug 18699 1726882354.51439: worker is 1 (out of 1 available) 18699 1726882354.51567: exiting _queue_task() for managed_node1/debug 18699 1726882354.51579: done queuing things up, now waiting for results queue to drain 18699 1726882354.51580: waiting for pending results... 
18699 1726882354.52057: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18699 1726882354.52189: in run() - task 12673a56-9f93-1ce6-d207-00000000004f 18699 1726882354.52312: variable 'ansible_search_path' from source: unknown 18699 1726882354.52316: variable 'ansible_search_path' from source: unknown 18699 1726882354.52318: calling self._execute() 18699 1726882354.52372: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.52385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.52406: variable 'omit' from source: magic vars 18699 1726882354.52805: variable 'ansible_distribution_major_version' from source: facts 18699 1726882354.52820: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882354.52833: variable 'omit' from source: magic vars 18699 1726882354.52878: variable 'omit' from source: magic vars 18699 1726882354.52919: variable 'omit' from source: magic vars 18699 1726882354.52964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882354.53004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882354.53029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882354.53050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882354.53068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882354.53106: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882354.53113: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.53119: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 18699 1726882354.53302: Set connection var ansible_connection to ssh 18699 1726882354.53305: Set connection var ansible_pipelining to False 18699 1726882354.53307: Set connection var ansible_shell_executable to /bin/sh 18699 1726882354.53308: Set connection var ansible_timeout to 10 18699 1726882354.53310: Set connection var ansible_shell_type to sh 18699 1726882354.53312: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882354.53313: variable 'ansible_shell_executable' from source: unknown 18699 1726882354.53315: variable 'ansible_connection' from source: unknown 18699 1726882354.53317: variable 'ansible_module_compression' from source: unknown 18699 1726882354.53319: variable 'ansible_shell_type' from source: unknown 18699 1726882354.53320: variable 'ansible_shell_executable' from source: unknown 18699 1726882354.53322: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.53323: variable 'ansible_pipelining' from source: unknown 18699 1726882354.53325: variable 'ansible_timeout' from source: unknown 18699 1726882354.53327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.53855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882354.53859: variable 'omit' from source: magic vars 18699 1726882354.53861: starting attempt loop 18699 1726882354.53864: running the handler 18699 1726882354.53867: variable '__network_connections_result' from source: set_fact 18699 1726882354.53869: variable '__network_connections_result' from source: set_fact 18699 1726882354.54082: handler run complete 18699 1726882354.54303: attempt loop complete, returning result 18699 1726882354.54306: 
_execute() done 18699 1726882354.54309: dumping result to json 18699 1726882354.54311: done dumping result, returning 18699 1726882354.54314: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-1ce6-d207-00000000004f] 18699 1726882354.54316: sending task result for task 12673a56-9f93-1ce6-d207-00000000004f
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "lsr27",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
18699 1726882354.54486: no more pending results, returning what we have 18699 1726882354.54490: results queue empty 18699 1726882354.54491: checking for any_errors_fatal 18699 1726882354.54502: done checking for any_errors_fatal 18699 1726882354.54503: checking for max_fail_percentage 18699 1726882354.54505: done checking for max_fail_percentage 18699 1726882354.54505: checking to see if all hosts have failed and the running result is not ok 18699 1726882354.54506: done checking to see if all hosts have failed 18699 1726882354.54507: getting the remaining hosts for this loop 18699 1726882354.54508: done getting the remaining hosts for this loop 18699 1726882354.54512: getting the next task for host managed_node1 18699 1726882354.54519: done getting next task for host managed_node1 18699 1726882354.54522: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18699 1726882354.54524: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18699 1726882354.54535: getting variables 18699 1726882354.54537: in VariableManager get_vars() 18699 1726882354.54572: Calling all_inventory to load vars for managed_node1 18699 1726882354.54575: Calling groups_inventory to load vars for managed_node1 18699 1726882354.54578: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882354.54587: Calling all_plugins_play to load vars for managed_node1 18699 1726882354.54590: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882354.55017: Calling groups_plugins_play to load vars for managed_node1 18699 1726882354.55557: done sending task result for task 12673a56-9f93-1ce6-d207-00000000004f 18699 1726882354.55561: WORKER PROCESS EXITING 18699 1726882354.56730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882354.60162: done with get_vars() 18699 1726882354.60186: done getting variables 18699 1726882354.60316: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:32:34 -0400 (0:00:00.095) 0:00:28.199 ****** 18699 1726882354.60473: entering _queue_task() for managed_node1/debug 18699 1726882354.61169: worker is 1 (out of 1 available) 18699 1726882354.61183: exiting _queue_task() for managed_node1/debug 18699 1726882354.61197: done queuing things up, now waiting for results queue to drain 18699 1726882354.61198: waiting for pending results... 
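The trace above shows a plain `debug` task resolving its variables, loading the cached `ssh` connection and `sh` shell plugins, and printing the previously registered `__network_connections_result`. In role terms, the task being executed is roughly of this shape (a hypothetical sketch based on the task name and variable seen in the log, not the actual `fedora.linux_system_roles.network` source):

```yaml
# Hypothetical sketch of the task traced above; the task name and the
# variable name come from the log, the surrounding structure is assumed.
- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```

The repeated `Evaluated conditional (ansible_distribution_major_version != '6'): True` entries show a role-wide `when:` guard being re-evaluated for each task; a task whose guard evaluates to False (such as the `network_state != {}` check queued next) is reported as `skipping` instead of running its action handler.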
18699 1726882354.61804: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18699 1726882354.61937: in run() - task 12673a56-9f93-1ce6-d207-000000000050 18699 1726882354.61960: variable 'ansible_search_path' from source: unknown 18699 1726882354.61972: variable 'ansible_search_path' from source: unknown 18699 1726882354.62033: calling self._execute() 18699 1726882354.62141: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.62153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.62170: variable 'omit' from source: magic vars 18699 1726882354.62814: variable 'ansible_distribution_major_version' from source: facts 18699 1726882354.62825: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882354.62954: variable 'network_state' from source: role '' defaults 18699 1726882354.62965: Evaluated conditional (network_state != {}): False 18699 1726882354.62969: when evaluation is False, skipping this task 18699 1726882354.62979: _execute() done 18699 1726882354.62982: dumping result to json 18699 1726882354.62987: done dumping result, returning 18699 1726882354.62999: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-1ce6-d207-000000000050] 18699 1726882354.63003: sending task result for task 12673a56-9f93-1ce6-d207-000000000050
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
18699 1726882354.63133: no more pending results, returning what we have 18699 1726882354.63136: results queue empty 18699 1726882354.63138: checking for any_errors_fatal 18699 1726882354.63146: done checking for any_errors_fatal 18699 1726882354.63147: checking for max_fail_percentage 18699 1726882354.63149: done checking for max_fail_percentage 18699 1726882354.63150: checking to see if all hosts have 
failed and the running result is not ok 18699 1726882354.63151: done checking to see if all hosts have failed 18699 1726882354.63151: getting the remaining hosts for this loop 18699 1726882354.63153: done getting the remaining hosts for this loop 18699 1726882354.63157: getting the next task for host managed_node1 18699 1726882354.63164: done getting next task for host managed_node1 18699 1726882354.63168: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18699 1726882354.63170: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882354.63185: getting variables 18699 1726882354.63298: in VariableManager get_vars() 18699 1726882354.63332: Calling all_inventory to load vars for managed_node1 18699 1726882354.63334: Calling groups_inventory to load vars for managed_node1 18699 1726882354.63336: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882354.63349: Calling all_plugins_play to load vars for managed_node1 18699 1726882354.63351: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882354.63355: Calling groups_plugins_play to load vars for managed_node1 18699 1726882354.63872: done sending task result for task 12673a56-9f93-1ce6-d207-000000000050 18699 1726882354.63876: WORKER PROCESS EXITING 18699 1726882354.65515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882354.67240: done with get_vars() 18699 1726882354.67269: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:32:34 -0400 
(0:00:00.070) 0:00:28.269 ****** 18699 1726882354.67377: entering _queue_task() for managed_node1/ping 18699 1726882354.67855: worker is 1 (out of 1 available) 18699 1726882354.67868: exiting _queue_task() for managed_node1/ping 18699 1726882354.67882: done queuing things up, now waiting for results queue to drain 18699 1726882354.67884: waiting for pending results... 18699 1726882354.68191: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 18699 1726882354.68283: in run() - task 12673a56-9f93-1ce6-d207-000000000051 18699 1726882354.68301: variable 'ansible_search_path' from source: unknown 18699 1726882354.68305: variable 'ansible_search_path' from source: unknown 18699 1726882354.68429: calling self._execute() 18699 1726882354.68446: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.68451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.68462: variable 'omit' from source: magic vars 18699 1726882354.68886: variable 'ansible_distribution_major_version' from source: facts 18699 1726882354.68890: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882354.68892: variable 'omit' from source: magic vars 18699 1726882354.68929: variable 'omit' from source: magic vars 18699 1726882354.68964: variable 'omit' from source: magic vars 18699 1726882354.69501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882354.69505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882354.69508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882354.69511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882354.69513: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882354.69515: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882354.69517: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.69520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.69522: Set connection var ansible_connection to ssh 18699 1726882354.69525: Set connection var ansible_pipelining to False 18699 1726882354.69528: Set connection var ansible_shell_executable to /bin/sh 18699 1726882354.69530: Set connection var ansible_timeout to 10 18699 1726882354.69532: Set connection var ansible_shell_type to sh 18699 1726882354.69535: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882354.69538: variable 'ansible_shell_executable' from source: unknown 18699 1726882354.69540: variable 'ansible_connection' from source: unknown 18699 1726882354.69543: variable 'ansible_module_compression' from source: unknown 18699 1726882354.69546: variable 'ansible_shell_type' from source: unknown 18699 1726882354.69548: variable 'ansible_shell_executable' from source: unknown 18699 1726882354.69551: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882354.69553: variable 'ansible_pipelining' from source: unknown 18699 1726882354.69556: variable 'ansible_timeout' from source: unknown 18699 1726882354.69558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882354.69571: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882354.69575: variable 'omit' from source: magic vars 18699 1726882354.69577: starting attempt loop 18699 1726882354.69579: running 
the handler 18699 1726882354.69599: _low_level_execute_command(): starting 18699 1726882354.69602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882354.70449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882354.70474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882354.70486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882354.70514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882354.70632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882354.72314: stdout chunk (state=3): >>>/root <<< 18699 1726882354.72591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882354.72601: stdout chunk (state=3): >>><<< 18699 1726882354.72604: stderr chunk (state=3): >>><<< 18699 1726882354.72626: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882354.72649: _low_level_execute_command(): starting 18699 1726882354.72674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792 `" && echo ansible-tmp-1726882354.7263436-20080-170683627620792="` echo /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792 `" ) && sleep 0' 18699 1726882354.74032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882354.74141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882354.74145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found <<< 18699 1726882354.74148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882354.74157: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882354.74160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882354.74162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882354.74250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882354.74253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882354.74286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882354.74451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882354.76203: stdout chunk (state=3): >>>ansible-tmp-1726882354.7263436-20080-170683627620792=/root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792 <<< 18699 1726882354.76368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882354.76371: stdout chunk (state=3): >>><<< 18699 1726882354.76374: stderr chunk (state=3): >>><<< 18699 1726882354.76504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882354.7263436-20080-170683627620792=/root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882354.76508: variable 'ansible_module_compression' from source: unknown 18699 1726882354.76526: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18699 1726882354.76579: variable 'ansible_facts' from source: unknown 18699 1726882354.76754: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py 18699 1726882354.76971: Sending initial data 18699 1726882354.77071: Sent initial data (153 bytes) 18699 1726882354.77662: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882354.77713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882354.77787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882354.77807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882354.77829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882354.78031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882354.79542: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882354.79577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882354.79634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpb7ysvi0r /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py <<< 18699 1726882354.79637: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py" <<< 18699 1726882354.79689: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpb7ysvi0r" to remote "/root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py" <<< 18699 1726882354.81109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882354.81113: stderr chunk (state=3): >>><<< 18699 1726882354.81115: stdout chunk (state=3): >>><<< 18699 1726882354.81117: done transferring module to remote 18699 1726882354.81119: _low_level_execute_command(): starting 18699 1726882354.81122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/ /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py && sleep 0' 18699 1726882354.81919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882354.82041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882354.82078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882354.82097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882354.82192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882354.82261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882354.84128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882354.84155: stdout chunk (state=3): >>><<< 18699 1726882354.84158: stderr chunk (state=3): >>><<< 18699 1726882354.84299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882354.84303: _low_level_execute_command(): starting 18699 1726882354.84307: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/AnsiballZ_ping.py && sleep 0' 18699 1726882354.85415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882354.85570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882354.85614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882354.85701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882355.00722: stdout chunk (state=3): >>> {"ping": "pong", 
"invocation": {"module_args": {"data": "pong"}}} <<< 18699 1726882355.01837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882355.01853: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 18699 1726882355.01913: stderr chunk (state=3): >>><<< 18699 1726882355.01951: stdout chunk (state=3): >>><<< 18699 1726882355.01979: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882355.02018: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882355.02034: _low_level_execute_command(): starting 18699 1726882355.02043: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882354.7263436-20080-170683627620792/ > /dev/null 2>&1 && sleep 0' 18699 1726882355.02723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882355.02739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882355.02753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882355.02781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882355.02887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882355.02901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882355.02920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882355.02999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882355.04823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882355.04884: stderr chunk (state=3): >>><<< 18699 1726882355.04970: stdout chunk (state=3): >>><<< 18699 1726882355.04974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
18699 1726882355.04982: handler run complete 18699 1726882355.04985: attempt loop complete, returning result 18699 1726882355.04987: _execute() done 18699 1726882355.04990: dumping result to json 18699 1726882355.04992: done dumping result, returning 18699 1726882355.05009: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-1ce6-d207-000000000051] 18699 1726882355.05022: sending task result for task 12673a56-9f93-1ce6-d207-000000000051 18699 1726882355.05179: done sending task result for task 12673a56-9f93-1ce6-d207-000000000051 18699 1726882355.05183: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "ping": "pong"
}
18699 1726882355.05243: no more pending results, returning what we have 18699 1726882355.05245: results queue empty 18699 1726882355.05246: checking for any_errors_fatal 18699 1726882355.05254: done checking for any_errors_fatal 18699 1726882355.05255: checking for max_fail_percentage 18699 1726882355.05257: done checking for max_fail_percentage 18699 1726882355.05258: checking to see if all hosts have failed and the running result is not ok 18699 1726882355.05258: done checking to see if all hosts have failed 18699 1726882355.05259: getting the remaining hosts for this loop 18699 1726882355.05260: done getting the remaining hosts for this loop 18699 1726882355.05264: getting the next task for host managed_node1 18699 1726882355.05272: done getting next task for host managed_node1 18699 1726882355.05274: ^ task is: TASK: meta (role_complete) 18699 1726882355.05275: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 18699 1726882355.05284: getting variables 18699 1726882355.05286: in VariableManager get_vars() 18699 1726882355.05326: Calling all_inventory to load vars for managed_node1 18699 1726882355.05329: Calling groups_inventory to load vars for managed_node1 18699 1726882355.05331: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882355.05341: Calling all_plugins_play to load vars for managed_node1 18699 1726882355.05344: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882355.05346: Calling groups_plugins_play to load vars for managed_node1 18699 1726882355.07828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882355.10559: done with get_vars() 18699 1726882355.10588: done getting variables 18699 1726882355.10868: done queuing things up, now waiting for results queue to drain 18699 1726882355.10870: results queue empty 18699 1726882355.10871: checking for any_errors_fatal 18699 1726882355.10874: done checking for any_errors_fatal 18699 1726882355.10874: checking for max_fail_percentage 18699 1726882355.10875: done checking for max_fail_percentage 18699 1726882355.10876: checking to see if all hosts have failed and the running result is not ok 18699 1726882355.10877: done checking to see if all hosts have failed 18699 1726882355.10878: getting the remaining hosts for this loop 18699 1726882355.10878: done getting the remaining hosts for this loop 18699 1726882355.10881: getting the next task for host managed_node1 18699 1726882355.10885: done getting next task for host managed_node1 18699 1726882355.10887: ^ task is: TASK: meta (flush_handlers) 18699 1726882355.10889: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18699 1726882355.10892: getting variables 18699 1726882355.10894: in VariableManager get_vars() 18699 1726882355.10974: Calling all_inventory to load vars for managed_node1 18699 1726882355.10976: Calling groups_inventory to load vars for managed_node1 18699 1726882355.10979: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882355.10983: Calling all_plugins_play to load vars for managed_node1 18699 1726882355.10986: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882355.10989: Calling groups_plugins_play to load vars for managed_node1 18699 1726882355.13324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882355.16663: done with get_vars() 18699 1726882355.16692: done getting variables 18699 1726882355.16900: in VariableManager get_vars() 18699 1726882355.16914: Calling all_inventory to load vars for managed_node1 18699 1726882355.16916: Calling groups_inventory to load vars for managed_node1 18699 1726882355.16918: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882355.16923: Calling all_plugins_play to load vars for managed_node1 18699 1726882355.16925: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882355.16928: Calling groups_plugins_play to load vars for managed_node1 18699 1726882355.19608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882355.22905: done with get_vars() 18699 1726882355.23055: done queuing things up, now waiting for results queue to drain 18699 1726882355.23057: results queue empty 18699 1726882355.23058: checking for any_errors_fatal 18699 1726882355.23059: done checking for any_errors_fatal 18699 1726882355.23060: checking for max_fail_percentage 18699 1726882355.23061: done checking for max_fail_percentage 18699 1726882355.23062: checking to see if all hosts have failed and 
the running result is not ok 18699 1726882355.23062: done checking to see if all hosts have failed 18699 1726882355.23063: getting the remaining hosts for this loop 18699 1726882355.23064: done getting the remaining hosts for this loop 18699 1726882355.23067: getting the next task for host managed_node1 18699 1726882355.23070: done getting next task for host managed_node1 18699 1726882355.23072: ^ task is: TASK: meta (flush_handlers) 18699 1726882355.23073: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882355.23076: getting variables 18699 1726882355.23077: in VariableManager get_vars() 18699 1726882355.23087: Calling all_inventory to load vars for managed_node1 18699 1726882355.23089: Calling groups_inventory to load vars for managed_node1 18699 1726882355.23091: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882355.23157: Calling all_plugins_play to load vars for managed_node1 18699 1726882355.23159: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882355.23162: Calling groups_plugins_play to load vars for managed_node1 18699 1726882355.25378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882355.28778: done with get_vars() 18699 1726882355.28806: done getting variables 18699 1726882355.28863: in VariableManager get_vars() 18699 1726882355.28876: Calling all_inventory to load vars for managed_node1 18699 1726882355.28878: Calling groups_inventory to load vars for managed_node1 18699 1726882355.28880: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882355.28885: Calling all_plugins_play to load vars for managed_node1 18699 1726882355.28888: Calling 
groups_plugins_inventory to load vars for managed_node1 18699 1726882355.28890: Calling groups_plugins_play to load vars for managed_node1 18699 1726882355.30852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882355.32430: done with get_vars() 18699 1726882355.32459: done queuing things up, now waiting for results queue to drain 18699 1726882355.32461: results queue empty 18699 1726882355.32462: checking for any_errors_fatal 18699 1726882355.32463: done checking for any_errors_fatal 18699 1726882355.32463: checking for max_fail_percentage 18699 1726882355.32464: done checking for max_fail_percentage 18699 1726882355.32465: checking to see if all hosts have failed and the running result is not ok 18699 1726882355.32466: done checking to see if all hosts have failed 18699 1726882355.32466: getting the remaining hosts for this loop 18699 1726882355.32499: done getting the remaining hosts for this loop 18699 1726882355.32504: getting the next task for host managed_node1 18699 1726882355.32507: done getting next task for host managed_node1 18699 1726882355.32508: ^ task is: None 18699 1726882355.32510: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882355.32511: done queuing things up, now waiting for results queue to drain 18699 1726882355.32512: results queue empty 18699 1726882355.32513: checking for any_errors_fatal 18699 1726882355.32513: done checking for any_errors_fatal 18699 1726882355.32514: checking for max_fail_percentage 18699 1726882355.32515: done checking for max_fail_percentage 18699 1726882355.32515: checking to see if all hosts have failed and the running result is not ok 18699 1726882355.32516: done checking to see if all hosts have failed 18699 1726882355.32517: getting the next task for host managed_node1 18699 1726882355.32519: done getting next task for host managed_node1 18699 1726882355.32520: ^ task is: None 18699 1726882355.32521: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882355.32560: in VariableManager get_vars() 18699 1726882355.32576: done with get_vars() 18699 1726882355.32580: in VariableManager get_vars() 18699 1726882355.32588: done with get_vars() 18699 1726882355.32595: variable 'omit' from source: magic vars 18699 1726882355.32625: in VariableManager get_vars() 18699 1726882355.32634: done with get_vars() 18699 1726882355.32652: variable 'omit' from source: magic vars

PLAY [Delete the interface] ****************************************************

18699 1726882355.32911: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882355.33100: getting the remaining hosts for this loop 18699 1726882355.33102: done getting the remaining hosts for this loop 18699 1726882355.33104: getting the next task for host managed_node1 18699 1726882355.33106: done getting next task for host managed_node1 18699 1726882355.33108: ^ task is: TASK: Gathering Facts 18699 1726882355.33110: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 18699 1726882355.33112: getting variables 18699 1726882355.33113: in VariableManager get_vars() 18699 1726882355.33121: Calling all_inventory to load vars for managed_node1 18699 1726882355.33123: Calling groups_inventory to load vars for managed_node1 18699 1726882355.33125: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882355.33131: Calling all_plugins_play to load vars for managed_node1 18699 1726882355.33133: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882355.33135: Calling groups_plugins_play to load vars for managed_node1 18699 1726882355.34961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882355.36442: done with get_vars() 18699 1726882355.36464: done getting variables 18699 1726882355.36514: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Friday 20 September 2024 21:32:35 -0400 (0:00:00.691) 0:00:28.961 ******

18699 1726882355.36542: entering _queue_task() for managed_node1/gather_facts 18699 1726882355.36878: worker is 1 (out of 1 available) 18699 1726882355.36889: exiting _queue_task() for managed_node1/gather_facts 18699 1726882355.37101: done queuing things up, now waiting for results queue to drain 18699 1726882355.37102: waiting for pending results... 
18699 1726882355.37170: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882355.37280: in run() - task 12673a56-9f93-1ce6-d207-0000000003f8 18699 1726882355.37303: variable 'ansible_search_path' from source: unknown 18699 1726882355.37345: calling self._execute() 18699 1726882355.37445: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882355.37456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882355.37469: variable 'omit' from source: magic vars 18699 1726882355.37859: variable 'ansible_distribution_major_version' from source: facts 18699 1726882355.37878: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882355.37888: variable 'omit' from source: magic vars 18699 1726882355.37924: variable 'omit' from source: magic vars 18699 1726882355.37979: variable 'omit' from source: magic vars 18699 1726882355.38012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882355.38088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882355.38092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882355.38102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882355.38118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882355.38152: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882355.38160: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882355.38167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882355.38257: Set connection var ansible_connection to ssh 18699 1726882355.38306: Set 
connection var ansible_pipelining to False 18699 1726882355.38308: Set connection var ansible_shell_executable to /bin/sh 18699 1726882355.38310: Set connection var ansible_timeout to 10 18699 1726882355.38312: Set connection var ansible_shell_type to sh 18699 1726882355.38314: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882355.38326: variable 'ansible_shell_executable' from source: unknown 18699 1726882355.38332: variable 'ansible_connection' from source: unknown 18699 1726882355.38337: variable 'ansible_module_compression' from source: unknown 18699 1726882355.38342: variable 'ansible_shell_type' from source: unknown 18699 1726882355.38346: variable 'ansible_shell_executable' from source: unknown 18699 1726882355.38351: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882355.38356: variable 'ansible_pipelining' from source: unknown 18699 1726882355.38361: variable 'ansible_timeout' from source: unknown 18699 1726882355.38414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882355.38557: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882355.38572: variable 'omit' from source: magic vars 18699 1726882355.38582: starting attempt loop 18699 1726882355.38589: running the handler 18699 1726882355.38611: variable 'ansible_facts' from source: unknown 18699 1726882355.38638: _low_level_execute_command(): starting 18699 1726882355.38650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882355.39689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882355.39697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882355.39803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882355.39827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882355.40011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882355.41752: stdout chunk (state=3): >>>/root <<< 18699 1726882355.42063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882355.42068: stdout chunk (state=3): >>><<< 18699 1726882355.42070: stderr chunk (state=3): >>><<< 18699 1726882355.42073: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882355.42076: _low_level_execute_command(): starting 18699 1726882355.42078: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679 `" && echo ansible-tmp-1726882355.4200485-20125-82954242341679="` echo /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679 `" ) && sleep 0' 18699 1726882355.43347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found <<< 18699 1726882355.43367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882355.43438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882355.43595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882355.43633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882355.45518: stdout chunk (state=3): >>>ansible-tmp-1726882355.4200485-20125-82954242341679=/root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679 <<< 18699 1726882355.45717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882355.45734: stdout chunk (state=3): >>><<< 18699 1726882355.45737: stderr chunk (state=3): >>><<< 18699 1726882355.45759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882355.4200485-20125-82954242341679=/root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882355.45899: variable 'ansible_module_compression' from source: unknown 18699 1726882355.45950: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882355.46111: variable 'ansible_facts' from source: unknown 18699 1726882355.46598: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py 18699 1726882355.47046: Sending initial data 18699 1726882355.47050: Sent initial data (153 bytes) 18699 1726882355.47989: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882355.48304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 
18699 1726882355.48314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882355.48392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882355.49914: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18699 1726882355.50123: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882355.50168: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882355.50278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpbemesjx5 /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py <<< 18699 1726882355.50281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py" <<< 18699 1726882355.50314: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpbemesjx5" to remote "/root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py" <<< 18699 1726882355.53719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882355.53723: stdout chunk (state=3): >>><<< 18699 1726882355.53725: stderr chunk (state=3): >>><<< 18699 1726882355.53727: done transferring module to remote 18699 1726882355.53728: _low_level_execute_command(): starting 18699 1726882355.53730: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/ /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py && sleep 0' 18699 1726882355.54706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882355.55008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882355.55215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882355.55297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882355.57072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882355.57083: stdout chunk (state=3): >>><<< 18699 1726882355.57100: stderr chunk (state=3): >>><<< 18699 1726882355.57128: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882355.57137: _low_level_execute_command(): starting 18699 1726882355.57147: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/AnsiballZ_setup.py && sleep 0' 18699 1726882355.58124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882355.58140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882355.58156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882355.58175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882355.58198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882355.58214: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882355.58263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18699 1726882355.58311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882355.58368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882355.58386: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882355.58406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882355.58503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.22273: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "35", "epoch": "1726882355", "epoch_int": "1726882355", "date": "2024-09-20", "time": "21:32:35", "iso8601_micro": "2024-09-21T01:32:35.849975Z", "iso8601": "2024-09-21T01:32:35Z", "iso8601_basic": "20240920T213235849975", "iso8601_basic_short": "20240920T213235", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_processor": ["0",<<< 18699 1726882356.22334: stdout chunk (state=3): >>> "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], 
"uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 789, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794828288, "block_size": 4096, "block_total": 65519099, "block_available": 63914753, "block_used": 1604346, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.38623046875, "5m": 0.3125, "15m": 0.15771484375}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": 
"on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": 
"7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee", "fe80::c05a:92ff:fe97:6608"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882356.24416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882356.24420: stdout chunk (state=3): >>><<< 18699 1726882356.24424: stderr chunk (state=3): >>><<< 18699 1726882356.24465: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", 
"SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "35", "epoch": "1726882355", "epoch_int": "1726882355", "date": "2024-09-20", "time": "21:32:35", "iso8601_micro": "2024-09-21T01:32:35.849975Z", "iso8601": "2024-09-21T01:32:35Z", "iso8601_basic": "20240920T213235849975", "iso8601_basic_short": "20240920T213235", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", 
"ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 789, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794828288, "block_size": 4096, "block_total": 65519099, "block_available": 63914753, "block_used": 1604346, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.38623046875, "5m": 0.3125, "15m": 0.15771484375}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "c2:5a:92:97:66:08", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::c05a:92ff:fe97:6608", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lsr27": {"device": "lsr27", "macaddress": "7a:fe:b4:01:4b:ee", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": 
"fe80::78fe:b4ff:fe01:4bee", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::c05a:92ff:fe97:6608", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223", "fe80::78fe:b4ff:fe01:4bee", "fe80::c05a:92ff:fe97:6608"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882356.25051: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882356.25054: _low_level_execute_command(): starting 18699 1726882356.25057: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882355.4200485-20125-82954242341679/ > /dev/null 2>&1 && sleep 0' 18699 1726882356.25583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882356.25602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882356.25616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882356.25650: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.25665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882356.25711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.25769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882356.25786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882356.25868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882356.25960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.27778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882356.27835: stderr chunk (state=3): >>><<< 18699 1726882356.27910: stdout chunk (state=3): >>><<< 18699 1726882356.27934: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882356.27947: handler run complete 18699 1726882356.28275: variable 'ansible_facts' from source: unknown 18699 1726882356.28514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.29240: variable 'ansible_facts' from source: unknown 18699 1726882356.29612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.29859: attempt loop complete, returning result 18699 1726882356.29870: _execute() done 18699 1726882356.29878: dumping result to json 18699 1726882356.30004: done dumping result, returning 18699 1726882356.30019: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-0000000003f8] 18699 1726882356.30031: sending task result for task 12673a56-9f93-1ce6-d207-0000000003f8 18699 1726882356.31133: done sending task result for task 12673a56-9f93-1ce6-d207-0000000003f8 18699 1726882356.31137: WORKER PROCESS EXITING ok: [managed_node1] 18699 1726882356.31651: no more pending results, returning what we have 18699 1726882356.31654: results queue empty 18699 1726882356.31655: checking for any_errors_fatal 18699 1726882356.31656: done checking for any_errors_fatal 
18699 1726882356.31657: checking for max_fail_percentage 18699 1726882356.31658: done checking for max_fail_percentage 18699 1726882356.31659: checking to see if all hosts have failed and the running result is not ok 18699 1726882356.31660: done checking to see if all hosts have failed 18699 1726882356.31660: getting the remaining hosts for this loop 18699 1726882356.31662: done getting the remaining hosts for this loop 18699 1726882356.31665: getting the next task for host managed_node1 18699 1726882356.31669: done getting next task for host managed_node1 18699 1726882356.31671: ^ task is: TASK: meta (flush_handlers) 18699 1726882356.31673: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882356.31676: getting variables 18699 1726882356.31677: in VariableManager get_vars() 18699 1726882356.31699: Calling all_inventory to load vars for managed_node1 18699 1726882356.31701: Calling groups_inventory to load vars for managed_node1 18699 1726882356.31704: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882356.31714: Calling all_plugins_play to load vars for managed_node1 18699 1726882356.31717: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882356.31719: Calling groups_plugins_play to load vars for managed_node1 18699 1726882356.35143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.38721: done with get_vars() 18699 1726882356.38744: done getting variables 18699 1726882356.38818: in VariableManager get_vars() 18699 1726882356.38829: Calling all_inventory to load vars for managed_node1 18699 1726882356.38832: Calling groups_inventory to load vars for managed_node1 18699 1726882356.38834: 
Calling all_plugins_inventory to load vars for managed_node1 18699 1726882356.38840: Calling all_plugins_play to load vars for managed_node1 18699 1726882356.38842: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882356.38845: Calling groups_plugins_play to load vars for managed_node1 18699 1726882356.40870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.44056: done with get_vars() 18699 1726882356.44092: done queuing things up, now waiting for results queue to drain 18699 1726882356.44201: results queue empty 18699 1726882356.44203: checking for any_errors_fatal 18699 1726882356.44207: done checking for any_errors_fatal 18699 1726882356.44213: checking for max_fail_percentage 18699 1726882356.44214: done checking for max_fail_percentage 18699 1726882356.44215: checking to see if all hosts have failed and the running result is not ok 18699 1726882356.44216: done checking to see if all hosts have failed 18699 1726882356.44216: getting the remaining hosts for this loop 18699 1726882356.44217: done getting the remaining hosts for this loop 18699 1726882356.44220: getting the next task for host managed_node1 18699 1726882356.44225: done getting next task for host managed_node1 18699 1726882356.44227: ^ task is: TASK: Include the task 'delete_interface.yml' 18699 1726882356.44229: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882356.44231: getting variables 18699 1726882356.44232: in VariableManager get_vars() 18699 1726882356.44303: Calling all_inventory to load vars for managed_node1 18699 1726882356.44306: Calling groups_inventory to load vars for managed_node1 18699 1726882356.44308: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882356.44314: Calling all_plugins_play to load vars for managed_node1 18699 1726882356.44316: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882356.44319: Calling groups_plugins_play to load vars for managed_node1 18699 1726882356.46858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.56133: done with get_vars() 18699 1726882356.56198: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:32:36 -0400 (0:00:01.197) 0:00:30.158 ****** 18699 1726882356.56274: entering _queue_task() for managed_node1/include_tasks 18699 1726882356.56824: worker is 1 (out of 1 available) 18699 1726882356.56836: exiting _queue_task() for managed_node1/include_tasks 18699 1726882356.56847: done queuing things up, now waiting for results queue to drain 18699 1726882356.56848: waiting for pending results... 
18699 1726882356.57057: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 18699 1726882356.57182: in run() - task 12673a56-9f93-1ce6-d207-000000000054 18699 1726882356.57186: variable 'ansible_search_path' from source: unknown 18699 1726882356.57214: calling self._execute() 18699 1726882356.57324: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882356.57337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882356.57371: variable 'omit' from source: magic vars 18699 1726882356.58051: variable 'ansible_distribution_major_version' from source: facts 18699 1726882356.58250: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882356.58254: _execute() done 18699 1726882356.58257: dumping result to json 18699 1726882356.58259: done dumping result, returning 18699 1726882356.58262: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [12673a56-9f93-1ce6-d207-000000000054] 18699 1726882356.58264: sending task result for task 12673a56-9f93-1ce6-d207-000000000054 18699 1726882356.58343: done sending task result for task 12673a56-9f93-1ce6-d207-000000000054 18699 1726882356.58345: WORKER PROCESS EXITING 18699 1726882356.58391: no more pending results, returning what we have 18699 1726882356.58399: in VariableManager get_vars() 18699 1726882356.58439: Calling all_inventory to load vars for managed_node1 18699 1726882356.58442: Calling groups_inventory to load vars for managed_node1 18699 1726882356.58446: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882356.58460: Calling all_plugins_play to load vars for managed_node1 18699 1726882356.58497: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882356.58503: Calling groups_plugins_play to load vars for managed_node1 18699 1726882356.60295: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.62138: done with get_vars() 18699 1726882356.62159: variable 'ansible_search_path' from source: unknown 18699 1726882356.62173: we have included files to process 18699 1726882356.62174: generating all_blocks data 18699 1726882356.62175: done generating all_blocks data 18699 1726882356.62176: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18699 1726882356.62177: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18699 1726882356.62179: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18699 1726882356.62416: done processing included file 18699 1726882356.62419: iterating over new_blocks loaded from include file 18699 1726882356.62420: in VariableManager get_vars() 18699 1726882356.62434: done with get_vars() 18699 1726882356.62436: filtering new block on tags 18699 1726882356.62451: done filtering new block on tags 18699 1726882356.62459: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 18699 1726882356.62464: extending task lists for all hosts with included blocks 18699 1726882356.62498: done extending task lists 18699 1726882356.62500: done processing included files 18699 1726882356.62501: results queue empty 18699 1726882356.62501: checking for any_errors_fatal 18699 1726882356.62503: done checking for any_errors_fatal 18699 1726882356.62504: checking for max_fail_percentage 18699 1726882356.62505: done checking for max_fail_percentage 18699 1726882356.62506: checking to see if all hosts have failed and the running result 
is not ok 18699 1726882356.62506: done checking to see if all hosts have failed 18699 1726882356.62507: getting the remaining hosts for this loop 18699 1726882356.62508: done getting the remaining hosts for this loop 18699 1726882356.62511: getting the next task for host managed_node1 18699 1726882356.62514: done getting next task for host managed_node1 18699 1726882356.62517: ^ task is: TASK: Remove test interface if necessary 18699 1726882356.62519: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882356.62521: getting variables 18699 1726882356.62522: in VariableManager get_vars() 18699 1726882356.62531: Calling all_inventory to load vars for managed_node1 18699 1726882356.62533: Calling groups_inventory to load vars for managed_node1 18699 1726882356.62535: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882356.62540: Calling all_plugins_play to load vars for managed_node1 18699 1726882356.62542: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882356.62545: Calling groups_plugins_play to load vars for managed_node1 18699 1726882356.63802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882356.65521: done with get_vars() 18699 1726882356.65543: done getting variables 18699 1726882356.65587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:32:36 -0400 (0:00:00.093) 0:00:30.252 ****** 18699 1726882356.65630: entering _queue_task() for managed_node1/command 18699 1726882356.65982: worker is 1 (out of 1 available) 18699 1726882356.66002: exiting _queue_task() for managed_node1/command 18699 1726882356.66013: done queuing things up, now waiting for results queue to drain 18699 1726882356.66014: waiting for pending results... 
18699 1726882356.66403: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 18699 1726882356.66409: in run() - task 12673a56-9f93-1ce6-d207-000000000409 18699 1726882356.66411: variable 'ansible_search_path' from source: unknown 18699 1726882356.66420: variable 'ansible_search_path' from source: unknown 18699 1726882356.66464: calling self._execute() 18699 1726882356.66573: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882356.66585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882356.66610: variable 'omit' from source: magic vars 18699 1726882356.67028: variable 'ansible_distribution_major_version' from source: facts 18699 1726882356.67054: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882356.67065: variable 'omit' from source: magic vars 18699 1726882356.67146: variable 'omit' from source: magic vars 18699 1726882356.67220: variable 'interface' from source: set_fact 18699 1726882356.67245: variable 'omit' from source: magic vars 18699 1726882356.67303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882356.67343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882356.67401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882356.67405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882356.67423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882356.67460: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882356.67578: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882356.67581: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882356.67583: Set connection var ansible_connection to ssh 18699 1726882356.67585: Set connection var ansible_pipelining to False 18699 1726882356.67599: Set connection var ansible_shell_executable to /bin/sh 18699 1726882356.67609: Set connection var ansible_timeout to 10 18699 1726882356.67616: Set connection var ansible_shell_type to sh 18699 1726882356.67624: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882356.67652: variable 'ansible_shell_executable' from source: unknown 18699 1726882356.67659: variable 'ansible_connection' from source: unknown 18699 1726882356.67665: variable 'ansible_module_compression' from source: unknown 18699 1726882356.67670: variable 'ansible_shell_type' from source: unknown 18699 1726882356.67675: variable 'ansible_shell_executable' from source: unknown 18699 1726882356.67684: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882356.67691: variable 'ansible_pipelining' from source: unknown 18699 1726882356.67703: variable 'ansible_timeout' from source: unknown 18699 1726882356.67710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882356.67844: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882356.67859: variable 'omit' from source: magic vars 18699 1726882356.67867: starting attempt loop 18699 1726882356.67901: running the handler 18699 1726882356.67904: _low_level_execute_command(): starting 18699 1726882356.67906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882356.68677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.68751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882356.68773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882356.68837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.70500: stdout chunk (state=3): >>>/root <<< 18699 1726882356.70639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882356.70653: stderr chunk (state=3): >>><<< 18699 1726882356.70664: stdout chunk (state=3): >>><<< 18699 1726882356.70704: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882356.70800: _low_level_execute_command(): starting 18699 1726882356.70805: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886 `" && echo ansible-tmp-1726882356.7071118-20188-131271398839886="` echo /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886 `" ) && sleep 0' 18699 1726882356.71371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882356.71385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882356.71406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882356.71466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.71536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882356.71564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882356.71606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882356.71657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.73518: stdout chunk (state=3): >>>ansible-tmp-1726882356.7071118-20188-131271398839886=/root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886 <<< 18699 1726882356.73702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882356.73721: stdout chunk (state=3): >>><<< 18699 1726882356.73724: stderr chunk (state=3): >>><<< 18699 1726882356.73743: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882356.7071118-20188-131271398839886=/root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882356.73898: variable 'ansible_module_compression' from source: unknown 18699 1726882356.73903: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882356.73905: variable 'ansible_facts' from source: unknown 18699 1726882356.73976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py 18699 1726882356.74143: Sending initial data 18699 1726882356.74153: Sent initial data (156 bytes) 18699 1726882356.74768: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882356.74783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882356.74812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882356.74847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.74935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882356.74953: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882356.74999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882356.75044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.76733: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882356.76772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882356.76829: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpkwvbl3fg /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py <<< 18699 1726882356.76832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py" <<< 18699 1726882356.76892: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpkwvbl3fg" to remote "/root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py" <<< 18699 1726882356.78148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882356.78157: stdout chunk (state=3): >>><<< 18699 1726882356.78169: stderr chunk (state=3): >>><<< 18699 1726882356.78192: done transferring module to remote 18699 1726882356.78215: _low_level_execute_command(): starting 18699 1726882356.78225: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/ /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py && sleep 0' 18699 1726882356.78801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882356.78817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882356.78833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882356.78851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882356.78868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 
18699 1726882356.78881: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882356.78903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.78925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882356.78938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882356.78949: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882356.78960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882356.78974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882356.79057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882356.79079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882356.79098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882356.79190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.80907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882356.80937: stderr chunk (state=3): >>><<< 18699 1726882356.80948: stdout chunk (state=3): >>><<< 18699 1726882356.80975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882356.80991: _low_level_execute_command(): starting 18699 1726882356.81005: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/AnsiballZ_command.py && sleep 0' 18699 1726882356.81631: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882356.81657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882356.81673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882356.81694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882356.81711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882356.81763: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882356.81825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882356.81842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882356.81871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882356.81997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882356.98159: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:32:36.967518", "end": "2024-09-20 21:32:36.977195", "delta": "0:00:00.009677", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882357.00025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882357.00048: stderr chunk (state=3): >>><<< 18699 1726882357.00063: stdout chunk (state=3): >>><<< 18699 1726882357.00102: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 21:32:36.967518", "end": "2024-09-20 21:32:36.977195", "delta": "0:00:00.009677", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882357.00140: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882357.00157: _low_level_execute_command(): starting 18699 1726882357.00167: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882356.7071118-20188-131271398839886/ > /dev/null 2>&1 && sleep 0' 18699 1726882357.00814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882357.00909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882357.00939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882357.00953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882357.01029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882357.02865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882357.02934: stderr chunk (state=3): >>><<< 18699 1726882357.02951: stdout chunk (state=3): >>><<< 18699 1726882357.02972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882357.02981: handler run complete 18699 1726882357.03018: Evaluated conditional 
(False): False 18699 1726882357.03122: attempt loop complete, returning result 18699 1726882357.03125: _execute() done 18699 1726882357.03127: dumping result to json 18699 1726882357.03129: done dumping result, returning 18699 1726882357.03131: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [12673a56-9f93-1ce6-d207-000000000409] 18699 1726882357.03133: sending task result for task 12673a56-9f93-1ce6-d207-000000000409 18699 1726882357.03204: done sending task result for task 12673a56-9f93-1ce6-d207-000000000409 18699 1726882357.03207: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.009677", "end": "2024-09-20 21:32:36.977195", "rc": 0, "start": "2024-09-20 21:32:36.967518" } 18699 1726882357.03361: no more pending results, returning what we have 18699 1726882357.03364: results queue empty 18699 1726882357.03365: checking for any_errors_fatal 18699 1726882357.03367: done checking for any_errors_fatal 18699 1726882357.03367: checking for max_fail_percentage 18699 1726882357.03369: done checking for max_fail_percentage 18699 1726882357.03370: checking to see if all hosts have failed and the running result is not ok 18699 1726882357.03371: done checking to see if all hosts have failed 18699 1726882357.03371: getting the remaining hosts for this loop 18699 1726882357.03373: done getting the remaining hosts for this loop 18699 1726882357.03376: getting the next task for host managed_node1 18699 1726882357.03384: done getting next task for host managed_node1 18699 1726882357.03386: ^ task is: TASK: meta (flush_handlers) 18699 1726882357.03388: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882357.03392: getting variables 18699 1726882357.03500: in VariableManager get_vars() 18699 1726882357.03532: Calling all_inventory to load vars for managed_node1 18699 1726882357.03534: Calling groups_inventory to load vars for managed_node1 18699 1726882357.03538: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882357.03549: Calling all_plugins_play to load vars for managed_node1 18699 1726882357.03551: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882357.03554: Calling groups_plugins_play to load vars for managed_node1 18699 1726882357.05074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882357.06750: done with get_vars() 18699 1726882357.06780: done getting variables 18699 1726882357.06848: in VariableManager get_vars() 18699 1726882357.06859: Calling all_inventory to load vars for managed_node1 18699 1726882357.06861: Calling groups_inventory to load vars for managed_node1 18699 1726882357.06867: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882357.06872: Calling all_plugins_play to load vars for managed_node1 18699 1726882357.06874: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882357.06877: Calling groups_plugins_play to load vars for managed_node1 18699 1726882357.08317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882357.09908: done with get_vars() 18699 1726882357.09936: done queuing things up, now waiting for results queue to drain 18699 1726882357.09938: results queue empty 18699 1726882357.09939: checking for any_errors_fatal 18699 1726882357.09942: done checking for any_errors_fatal 18699 1726882357.09942: checking for max_fail_percentage 18699 1726882357.09943: done checking for max_fail_percentage 18699 1726882357.09944: checking to see if all hosts have failed and the running result is not 
ok 18699 1726882357.09945: done checking to see if all hosts have failed 18699 1726882357.09945: getting the remaining hosts for this loop 18699 1726882357.09946: done getting the remaining hosts for this loop 18699 1726882357.09949: getting the next task for host managed_node1 18699 1726882357.09952: done getting next task for host managed_node1 18699 1726882357.09958: ^ task is: TASK: meta (flush_handlers) 18699 1726882357.09960: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882357.09963: getting variables 18699 1726882357.09964: in VariableManager get_vars() 18699 1726882357.09972: Calling all_inventory to load vars for managed_node1 18699 1726882357.09974: Calling groups_inventory to load vars for managed_node1 18699 1726882357.09976: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882357.09981: Calling all_plugins_play to load vars for managed_node1 18699 1726882357.09983: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882357.09985: Calling groups_plugins_play to load vars for managed_node1 18699 1726882357.12270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882357.15910: done with get_vars() 18699 1726882357.15932: done getting variables 18699 1726882357.15983: in VariableManager get_vars() 18699 1726882357.15996: Calling all_inventory to load vars for managed_node1 18699 1726882357.15998: Calling groups_inventory to load vars for managed_node1 18699 1726882357.16000: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882357.16006: Calling all_plugins_play to load vars for managed_node1 18699 1726882357.16008: Calling groups_plugins_inventory to load vars for 
managed_node1 18699 1726882357.16011: Calling groups_plugins_play to load vars for managed_node1 18699 1726882357.17288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882357.18975: done with get_vars() 18699 1726882357.19003: done queuing things up, now waiting for results queue to drain 18699 1726882357.19005: results queue empty 18699 1726882357.19006: checking for any_errors_fatal 18699 1726882357.19007: done checking for any_errors_fatal 18699 1726882357.19008: checking for max_fail_percentage 18699 1726882357.19009: done checking for max_fail_percentage 18699 1726882357.19010: checking to see if all hosts have failed and the running result is not ok 18699 1726882357.19011: done checking to see if all hosts have failed 18699 1726882357.19011: getting the remaining hosts for this loop 18699 1726882357.19012: done getting the remaining hosts for this loop 18699 1726882357.19015: getting the next task for host managed_node1 18699 1726882357.19018: done getting next task for host managed_node1 18699 1726882357.19019: ^ task is: None 18699 1726882357.19020: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882357.19021: done queuing things up, now waiting for results queue to drain 18699 1726882357.19022: results queue empty 18699 1726882357.19022: checking for any_errors_fatal 18699 1726882357.19023: done checking for any_errors_fatal 18699 1726882357.19024: checking for max_fail_percentage 18699 1726882357.19025: done checking for max_fail_percentage 18699 1726882357.19025: checking to see if all hosts have failed and the running result is not ok 18699 1726882357.19026: done checking to see if all hosts have failed 18699 1726882357.19027: getting the next task for host managed_node1 18699 1726882357.19029: done getting next task for host managed_node1 18699 1726882357.19029: ^ task is: None 18699 1726882357.19030: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882357.19065: in VariableManager get_vars() 18699 1726882357.19085: done with get_vars() 18699 1726882357.19090: in VariableManager get_vars() 18699 1726882357.19107: done with get_vars() 18699 1726882357.19111: variable 'omit' from source: magic vars 18699 1726882357.19227: variable 'profile' from source: play vars 18699 1726882357.19325: in VariableManager get_vars() 18699 1726882357.19339: done with get_vars() 18699 1726882357.19358: variable 'omit' from source: magic vars 18699 1726882357.19423: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18699 1726882357.20117: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882357.20136: getting the remaining hosts for this loop 18699 1726882357.20138: done getting the remaining hosts for this loop 18699 1726882357.20140: getting the next task for host managed_node1 18699 1726882357.20142: done getting next task for host managed_node1 18699 1726882357.20144: ^ task is: TASK: Gathering Facts 18699 1726882357.20145: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882357.20147: getting variables 18699 1726882357.20148: in VariableManager get_vars() 18699 1726882357.20157: Calling all_inventory to load vars for managed_node1 18699 1726882357.20159: Calling groups_inventory to load vars for managed_node1 18699 1726882357.20160: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882357.20165: Calling all_plugins_play to load vars for managed_node1 18699 1726882357.20167: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882357.20169: Calling groups_plugins_play to load vars for managed_node1 18699 1726882357.22457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882357.25033: done with get_vars() 18699 1726882357.25059: done getting variables 18699 1726882357.25111: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:32:37 -0400 (0:00:00.595) 0:00:30.847 ****** 18699 1726882357.25138: entering _queue_task() for managed_node1/gather_facts 18699 1726882357.25483: worker is 1 (out of 1 available) 18699 1726882357.25600: exiting _queue_task() for managed_node1/gather_facts 18699 1726882357.25611: done queuing things up, now waiting for results queue to drain 18699 1726882357.25612: waiting for pending results... 
18699 1726882357.25811: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882357.25900: in run() - task 12673a56-9f93-1ce6-d207-000000000417 18699 1726882357.25945: variable 'ansible_search_path' from source: unknown 18699 1726882357.25967: calling self._execute() 18699 1726882357.26075: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882357.26301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882357.26305: variable 'omit' from source: magic vars 18699 1726882357.26480: variable 'ansible_distribution_major_version' from source: facts 18699 1726882357.26501: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882357.26511: variable 'omit' from source: magic vars 18699 1726882357.26547: variable 'omit' from source: magic vars 18699 1726882357.26585: variable 'omit' from source: magic vars 18699 1726882357.26636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882357.26672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882357.26700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882357.26723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882357.26740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882357.26774: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882357.26856: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882357.26858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882357.26889: Set connection var ansible_connection to ssh 18699 1726882357.26906: Set 
connection var ansible_pipelining to False 18699 1726882357.26917: Set connection var ansible_shell_executable to /bin/sh 18699 1726882357.26927: Set connection var ansible_timeout to 10 18699 1726882357.26933: Set connection var ansible_shell_type to sh 18699 1726882357.26941: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882357.26977: variable 'ansible_shell_executable' from source: unknown 18699 1726882357.26984: variable 'ansible_connection' from source: unknown 18699 1726882357.26992: variable 'ansible_module_compression' from source: unknown 18699 1726882357.27005: variable 'ansible_shell_type' from source: unknown 18699 1726882357.27012: variable 'ansible_shell_executable' from source: unknown 18699 1726882357.27018: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882357.27026: variable 'ansible_pipelining' from source: unknown 18699 1726882357.27032: variable 'ansible_timeout' from source: unknown 18699 1726882357.27038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882357.27290: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882357.27294: variable 'omit' from source: magic vars 18699 1726882357.27300: starting attempt loop 18699 1726882357.27302: running the handler 18699 1726882357.27304: variable 'ansible_facts' from source: unknown 18699 1726882357.27306: _low_level_execute_command(): starting 18699 1726882357.27308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882357.28025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882357.28043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 18699 1726882357.28065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882357.28176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882357.28190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882357.28219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882357.28408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882357.30033: stdout chunk (state=3): >>>/root <<< 18699 1726882357.30165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882357.30186: stdout chunk (state=3): >>><<< 18699 1726882357.30205: stderr chunk (state=3): >>><<< 18699 1726882357.30235: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882357.30254: _low_level_execute_command(): starting 18699 1726882357.30264: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380 `" && echo ansible-tmp-1726882357.3024154-20226-54337777653380="` echo /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380 `" ) && sleep 0' 18699 1726882357.31687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882357.31705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882357.31740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882357.31756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882357.31826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882357.33721: stdout chunk (state=3): >>>ansible-tmp-1726882357.3024154-20226-54337777653380=/root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380 <<< 18699 1726882357.33821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882357.33877: stderr chunk (state=3): >>><<< 18699 1726882357.33916: stdout chunk (state=3): >>><<< 18699 1726882357.33944: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882357.3024154-20226-54337777653380=/root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882357.34132: variable 'ansible_module_compression' from source: unknown 18699 1726882357.34188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882357.34258: variable 'ansible_facts' from source: unknown 18699 1726882357.34758: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py 18699 1726882357.34980: Sending initial data 18699 1726882357.35067: Sent initial data (153 bytes) 18699 1726882357.36413: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882357.36434: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882357.36453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882357.36475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882357.36649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882357.38149: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18699 1726882357.38169: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882357.38238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882357.38284: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpgq0ymuyr /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py <<< 18699 1726882357.38302: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py" <<< 18699 1726882357.38420: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpgq0ymuyr" to remote "/root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py" <<< 18699 1726882357.41068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882357.41080: stdout chunk (state=3): >>><<< 18699 1726882357.41105: stderr chunk (state=3): >>><<< 18699 1726882357.41131: done transferring module to remote 18699 1726882357.41218: _low_level_execute_command(): starting 18699 1726882357.41231: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/ /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py && sleep 0' 18699 1726882357.42075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882357.42202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882357.42207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882357.42238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882357.42413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882357.42487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882357.44268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882357.44281: stdout chunk (state=3): >>><<< 18699 1726882357.44469: stderr chunk (state=3): >>><<< 18699 1726882357.44481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882357.44488: _low_level_execute_command(): starting 18699 1726882357.44491: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/AnsiballZ_setup.py && sleep 0' 18699 1726882357.45901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882357.45905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882357.45908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882357.45913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882357.45919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882357.45965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882357.45969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 
1726882357.46049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882357.46127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882358.10257: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "37", "epoch": "1726882357", "epoch_int": "1726882357", "date": "2024-09-20", "time": "21:32:37", "iso8601_micro": "2024-09-21T01:32:37.725930Z", "iso8601": "2024-09-21T01:32:37Z", "iso8601_basic": "20240920T213237725930", "iso8601_basic_short": "20240920T213237", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.515625, "5m": 0.3408203125, "15m": 0.16796875}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_ver<<< 18699 1726882358.10274: stdout chunk (state=3): >>>sion": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2954, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 577, "free": 2954}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, 
"ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 790, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794828288, "block_size": 4096, "block_total": 65519099, 
"block_available": 63914753, "block_used": 1604346, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127<<< 18699 1726882358.10311: stdout chunk (state=3): >>>.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": 
"10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882358.12259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882358.12263: stdout chunk (state=3): >>><<< 18699 1726882358.12265: stderr chunk (state=3): >>><<< 18699 1726882358.12401: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "37", "epoch": "1726882357", "epoch_int": "1726882357", "date": "2024-09-20", "time": "21:32:37", "iso8601_micro": "2024-09-21T01:32:37.725930Z", "iso8601": "2024-09-21T01:32:37Z", "iso8601_basic": "20240920T213237725930", "iso8601_basic_short": "20240920T213237", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.515625, "5m": 0.3408203125, "15m": 0.16796875}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 
5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2954, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 577, "free": 2954}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 790, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794828288, "block_size": 4096, "block_total": 65519099, "block_available": 63914753, "block_used": 1604346, "inode_total": 131070960, "inode_available": 131029046, "inode_used": 41914, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": 
"on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": 
{"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882358.12700: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882358.12729: _low_level_execute_command(): starting 18699 1726882358.12747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882357.3024154-20226-54337777653380/ > /dev/null 2>&1 && sleep 0' 18699 1726882358.13392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882358.13516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882358.13538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882358.13623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882358.15581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882358.15585: stdout chunk (state=3): >>><<< 18699 1726882358.15588: stderr chunk (state=3): >>><<< 18699 1726882358.16304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882358.16308: handler run complete 18699 1726882358.16310: variable 'ansible_facts' from source: unknown 18699 1726882358.16544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.17230: variable 'ansible_facts' from source: unknown 18699 1726882358.17320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.17630: attempt loop complete, returning result 18699 1726882358.18002: _execute() done 18699 1726882358.18005: dumping result to json 18699 1726882358.18008: done dumping result, returning 18699 1726882358.18010: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-000000000417] 18699 1726882358.18012: sending task result for task 12673a56-9f93-1ce6-d207-000000000417 ok: [managed_node1] 18699 1726882358.18674: no more pending results, returning what we have 18699 1726882358.18678: results queue empty 18699 1726882358.18679: checking for any_errors_fatal 18699 1726882358.18680: done checking for any_errors_fatal 18699 1726882358.18681: checking for max_fail_percentage 18699 1726882358.18683: done checking for max_fail_percentage 18699 1726882358.18684: checking to see if all hosts have failed and the running result is not ok 18699 1726882358.18685: done checking to see if all hosts have failed 18699 1726882358.18685: getting the remaining hosts for this loop 18699 1726882358.18687: done getting the remaining hosts for this loop 18699 1726882358.18691: getting the next task for host managed_node1 18699 1726882358.18799: 
done getting next task for host managed_node1 18699 1726882358.18802: ^ task is: TASK: meta (flush_handlers) 18699 1726882358.18804: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882358.18808: getting variables 18699 1726882358.18810: in VariableManager get_vars() 18699 1726882358.18932: Calling all_inventory to load vars for managed_node1 18699 1726882358.18935: Calling groups_inventory to load vars for managed_node1 18699 1726882358.18937: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.18949: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.18952: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.18957: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.19704: done sending task result for task 12673a56-9f93-1ce6-d207-000000000417 18699 1726882358.19708: WORKER PROCESS EXITING 18699 1726882358.21944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.23816: done with get_vars() 18699 1726882358.23841: done getting variables 18699 1726882358.23921: in VariableManager get_vars() 18699 1726882358.23935: Calling all_inventory to load vars for managed_node1 18699 1726882358.23938: Calling groups_inventory to load vars for managed_node1 18699 1726882358.23940: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.23945: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.23947: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.23950: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.25097: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.26645: done with get_vars() 18699 1726882358.26672: done queuing things up, now waiting for results queue to drain 18699 1726882358.26674: results queue empty 18699 1726882358.26675: checking for any_errors_fatal 18699 1726882358.26679: done checking for any_errors_fatal 18699 1726882358.26679: checking for max_fail_percentage 18699 1726882358.26680: done checking for max_fail_percentage 18699 1726882358.26681: checking to see if all hosts have failed and the running result is not ok 18699 1726882358.26682: done checking to see if all hosts have failed 18699 1726882358.26687: getting the remaining hosts for this loop 18699 1726882358.26688: done getting the remaining hosts for this loop 18699 1726882358.26691: getting the next task for host managed_node1 18699 1726882358.26699: done getting next task for host managed_node1 18699 1726882358.26702: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18699 1726882358.26704: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882358.26714: getting variables 18699 1726882358.26715: in VariableManager get_vars() 18699 1726882358.26730: Calling all_inventory to load vars for managed_node1 18699 1726882358.26732: Calling groups_inventory to load vars for managed_node1 18699 1726882358.26734: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.26740: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.26742: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.26745: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.27921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.29445: done with get_vars() 18699 1726882358.29466: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:32:38 -0400 (0:00:01.044) 0:00:31.891 ****** 18699 1726882358.29552: entering _queue_task() for managed_node1/include_tasks 18699 1726882358.29910: worker is 1 (out of 1 available) 18699 1726882358.29922: exiting _queue_task() for managed_node1/include_tasks 18699 1726882358.29935: done queuing things up, now waiting for results queue to drain 18699 1726882358.29935: waiting for pending results... 
18699 1726882358.30189: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18699 1726882358.30325: in run() - task 12673a56-9f93-1ce6-d207-00000000005c 18699 1726882358.30350: variable 'ansible_search_path' from source: unknown 18699 1726882358.30358: variable 'ansible_search_path' from source: unknown 18699 1726882358.30405: calling self._execute() 18699 1726882358.30515: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.30526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882358.30545: variable 'omit' from source: magic vars 18699 1726882358.30939: variable 'ansible_distribution_major_version' from source: facts 18699 1726882358.30958: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882358.30973: _execute() done 18699 1726882358.30982: dumping result to json 18699 1726882358.30991: done dumping result, returning 18699 1726882358.31081: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-1ce6-d207-00000000005c] 18699 1726882358.31084: sending task result for task 12673a56-9f93-1ce6-d207-00000000005c 18699 1726882358.31157: done sending task result for task 12673a56-9f93-1ce6-d207-00000000005c 18699 1726882358.31160: WORKER PROCESS EXITING 18699 1726882358.31204: no more pending results, returning what we have 18699 1726882358.31210: in VariableManager get_vars() 18699 1726882358.31256: Calling all_inventory to load vars for managed_node1 18699 1726882358.31259: Calling groups_inventory to load vars for managed_node1 18699 1726882358.31262: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.31275: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.31278: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.31281: Calling 
groups_plugins_play to load vars for managed_node1 18699 1726882358.32812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.34352: done with get_vars() 18699 1726882358.34373: variable 'ansible_search_path' from source: unknown 18699 1726882358.34374: variable 'ansible_search_path' from source: unknown 18699 1726882358.34407: we have included files to process 18699 1726882358.34408: generating all_blocks data 18699 1726882358.34409: done generating all_blocks data 18699 1726882358.34410: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18699 1726882358.34411: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18699 1726882358.34413: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18699 1726882358.34983: done processing included file 18699 1726882358.34985: iterating over new_blocks loaded from include file 18699 1726882358.34986: in VariableManager get_vars() 18699 1726882358.35013: done with get_vars() 18699 1726882358.35015: filtering new block on tags 18699 1726882358.35031: done filtering new block on tags 18699 1726882358.35034: in VariableManager get_vars() 18699 1726882358.35055: done with get_vars() 18699 1726882358.35057: filtering new block on tags 18699 1726882358.35075: done filtering new block on tags 18699 1726882358.35077: in VariableManager get_vars() 18699 1726882358.35102: done with get_vars() 18699 1726882358.35104: filtering new block on tags 18699 1726882358.35120: done filtering new block on tags 18699 1726882358.35122: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 18699 1726882358.35127: extending task lists for 
all hosts with included blocks 18699 1726882358.35517: done extending task lists 18699 1726882358.35519: done processing included files 18699 1726882358.35520: results queue empty 18699 1726882358.35520: checking for any_errors_fatal 18699 1726882358.35522: done checking for any_errors_fatal 18699 1726882358.35523: checking for max_fail_percentage 18699 1726882358.35524: done checking for max_fail_percentage 18699 1726882358.35524: checking to see if all hosts have failed and the running result is not ok 18699 1726882358.35525: done checking to see if all hosts have failed 18699 1726882358.35526: getting the remaining hosts for this loop 18699 1726882358.35527: done getting the remaining hosts for this loop 18699 1726882358.35530: getting the next task for host managed_node1 18699 1726882358.35533: done getting next task for host managed_node1 18699 1726882358.35536: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18699 1726882358.35539: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882358.35548: getting variables 18699 1726882358.35549: in VariableManager get_vars() 18699 1726882358.35563: Calling all_inventory to load vars for managed_node1 18699 1726882358.35565: Calling groups_inventory to load vars for managed_node1 18699 1726882358.35567: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.35572: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.35575: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.35578: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.36824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.38369: done with get_vars() 18699 1726882358.38390: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:32:38 -0400 (0:00:00.089) 0:00:31.980 ****** 18699 1726882358.38466: entering _queue_task() for managed_node1/setup 18699 1726882358.38831: worker is 1 (out of 1 available) 18699 1726882358.38843: exiting _queue_task() for managed_node1/setup 18699 1726882358.38854: done queuing things up, now waiting for results queue to drain 18699 1726882358.38855: waiting for pending results... 
18699 1726882358.39224: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18699 1726882358.39255: in run() - task 12673a56-9f93-1ce6-d207-000000000458 18699 1726882358.39275: variable 'ansible_search_path' from source: unknown 18699 1726882358.39284: variable 'ansible_search_path' from source: unknown 18699 1726882358.39330: calling self._execute() 18699 1726882358.39499: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.39505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882358.39508: variable 'omit' from source: magic vars 18699 1726882358.39824: variable 'ansible_distribution_major_version' from source: facts 18699 1726882358.39844: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882358.40081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882358.42291: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882358.42364: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882358.42405: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882358.42442: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882358.42700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882358.42704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882358.42707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882358.42709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882358.42711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882358.42712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882358.42714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882358.42734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882358.42759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882358.42798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882358.42814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882358.42964: variable '__network_required_facts' from source: role 
'' defaults 18699 1726882358.42981: variable 'ansible_facts' from source: unknown 18699 1726882358.43767: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18699 1726882358.43776: when evaluation is False, skipping this task 18699 1726882358.43784: _execute() done 18699 1726882358.43791: dumping result to json 18699 1726882358.43808: done dumping result, returning 18699 1726882358.43820: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-1ce6-d207-000000000458] 18699 1726882358.43830: sending task result for task 12673a56-9f93-1ce6-d207-000000000458 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882358.44063: no more pending results, returning what we have 18699 1726882358.44067: results queue empty 18699 1726882358.44068: checking for any_errors_fatal 18699 1726882358.44070: done checking for any_errors_fatal 18699 1726882358.44070: checking for max_fail_percentage 18699 1726882358.44072: done checking for max_fail_percentage 18699 1726882358.44073: checking to see if all hosts have failed and the running result is not ok 18699 1726882358.44073: done checking to see if all hosts have failed 18699 1726882358.44074: getting the remaining hosts for this loop 18699 1726882358.44075: done getting the remaining hosts for this loop 18699 1726882358.44079: getting the next task for host managed_node1 18699 1726882358.44088: done getting next task for host managed_node1 18699 1726882358.44092: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18699 1726882358.44098: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882358.44110: getting variables 18699 1726882358.44112: in VariableManager get_vars() 18699 1726882358.44152: Calling all_inventory to load vars for managed_node1 18699 1726882358.44155: Calling groups_inventory to load vars for managed_node1 18699 1726882358.44157: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.44167: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.44169: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.44172: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.44708: done sending task result for task 12673a56-9f93-1ce6-d207-000000000458 18699 1726882358.44712: WORKER PROCESS EXITING 18699 1726882358.45783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.47519: done with get_vars() 18699 1726882358.47541: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:32:38 -0400 (0:00:00.091) 0:00:32.072 ****** 18699 1726882358.47632: entering _queue_task() for managed_node1/stat 18699 1726882358.47958: worker is 1 (out of 1 available) 18699 1726882358.47973: exiting _queue_task() for managed_node1/stat 18699 1726882358.47984: done queuing things up, now waiting for results queue to drain 18699 1726882358.47985: waiting for pending results... 
18699 1726882358.48207: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 18699 1726882358.48352: in run() - task 12673a56-9f93-1ce6-d207-00000000045a 18699 1726882358.48374: variable 'ansible_search_path' from source: unknown 18699 1726882358.48383: variable 'ansible_search_path' from source: unknown 18699 1726882358.48433: calling self._execute() 18699 1726882358.48529: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.48541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882358.48555: variable 'omit' from source: magic vars 18699 1726882358.48953: variable 'ansible_distribution_major_version' from source: facts 18699 1726882358.48974: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882358.49149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882358.49437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882358.49507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882358.49530: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882358.49568: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882358.49658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882358.49724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882358.49727: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882358.49739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882358.49821: variable '__network_is_ostree' from source: set_fact 18699 1726882358.49839: Evaluated conditional (not __network_is_ostree is defined): False 18699 1726882358.49846: when evaluation is False, skipping this task 18699 1726882358.49853: _execute() done 18699 1726882358.49860: dumping result to json 18699 1726882358.49869: done dumping result, returning 18699 1726882358.50000: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-1ce6-d207-00000000045a] 18699 1726882358.50004: sending task result for task 12673a56-9f93-1ce6-d207-00000000045a 18699 1726882358.50072: done sending task result for task 12673a56-9f93-1ce6-d207-00000000045a 18699 1726882358.50075: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18699 1726882358.50129: no more pending results, returning what we have 18699 1726882358.50133: results queue empty 18699 1726882358.50134: checking for any_errors_fatal 18699 1726882358.50140: done checking for any_errors_fatal 18699 1726882358.50141: checking for max_fail_percentage 18699 1726882358.50143: done checking for max_fail_percentage 18699 1726882358.50144: checking to see if all hosts have failed and the running result is not ok 18699 1726882358.50145: done checking to see if all hosts have failed 18699 1726882358.50145: getting the remaining hosts for this loop 18699 1726882358.50147: done getting the remaining hosts for this loop 18699 
1726882358.50151: getting the next task for host managed_node1 18699 1726882358.50157: done getting next task for host managed_node1 18699 1726882358.50161: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18699 1726882358.50164: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882358.50176: getting variables 18699 1726882358.50178: in VariableManager get_vars() 18699 1726882358.50219: Calling all_inventory to load vars for managed_node1 18699 1726882358.50222: Calling groups_inventory to load vars for managed_node1 18699 1726882358.50224: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.50234: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.50237: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.50240: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.51853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.53460: done with get_vars() 18699 1726882358.53490: done getting variables 18699 1726882358.53549: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:32:38 -0400 (0:00:00.059) 0:00:32.131 ****** 18699 1726882358.53585: entering _queue_task() for managed_node1/set_fact 18699 1726882358.53932: worker is 1 (out of 1 available) 18699 1726882358.53945: exiting _queue_task() for managed_node1/set_fact 18699 1726882358.53957: done queuing things up, now waiting for results queue to drain 18699 1726882358.53958: waiting for pending results... 18699 1726882358.54323: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18699 1726882358.54377: in run() - task 12673a56-9f93-1ce6-d207-00000000045b 18699 1726882358.54404: variable 'ansible_search_path' from source: unknown 18699 1726882358.54416: variable 'ansible_search_path' from source: unknown 18699 1726882358.54462: calling self._execute() 18699 1726882358.54568: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.54640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882358.54644: variable 'omit' from source: magic vars 18699 1726882358.54978: variable 'ansible_distribution_major_version' from source: facts 18699 1726882358.54999: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882358.55167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882358.55449: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882358.55506: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882358.55547: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 
1726882358.55586: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882358.55673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882358.55731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882358.55734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882358.55756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882358.55847: variable '__network_is_ostree' from source: set_fact 18699 1726882358.55859: Evaluated conditional (not __network_is_ostree is defined): False 18699 1726882358.55866: when evaluation is False, skipping this task 18699 1726882358.55948: _execute() done 18699 1726882358.55951: dumping result to json 18699 1726882358.55953: done dumping result, returning 18699 1726882358.55955: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-1ce6-d207-00000000045b] 18699 1726882358.55957: sending task result for task 12673a56-9f93-1ce6-d207-00000000045b 18699 1726882358.56023: done sending task result for task 12673a56-9f93-1ce6-d207-00000000045b 18699 1726882358.56026: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18699 1726882358.56105: no more pending results, returning what we 
have 18699 1726882358.56108: results queue empty 18699 1726882358.56109: checking for any_errors_fatal 18699 1726882358.56116: done checking for any_errors_fatal 18699 1726882358.56116: checking for max_fail_percentage 18699 1726882358.56118: done checking for max_fail_percentage 18699 1726882358.56119: checking to see if all hosts have failed and the running result is not ok 18699 1726882358.56120: done checking to see if all hosts have failed 18699 1726882358.56120: getting the remaining hosts for this loop 18699 1726882358.56122: done getting the remaining hosts for this loop 18699 1726882358.56126: getting the next task for host managed_node1 18699 1726882358.56135: done getting next task for host managed_node1 18699 1726882358.56140: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18699 1726882358.56143: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882358.56156: getting variables 18699 1726882358.56158: in VariableManager get_vars() 18699 1726882358.56200: Calling all_inventory to load vars for managed_node1 18699 1726882358.56203: Calling groups_inventory to load vars for managed_node1 18699 1726882358.56206: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882358.56217: Calling all_plugins_play to load vars for managed_node1 18699 1726882358.56220: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882358.56222: Calling groups_plugins_play to load vars for managed_node1 18699 1726882358.58026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882358.59598: done with get_vars() 18699 1726882358.59626: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:32:38 -0400 (0:00:00.061) 0:00:32.193 ****** 18699 1726882358.59716: entering _queue_task() for managed_node1/service_facts 18699 1726882358.60047: worker is 1 (out of 1 available) 18699 1726882358.60060: exiting _queue_task() for managed_node1/service_facts 18699 1726882358.60073: done queuing things up, now waiting for results queue to drain 18699 1726882358.60074: waiting for pending results... 
18699 1726882358.60391: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 18699 1726882358.60474: in run() - task 12673a56-9f93-1ce6-d207-00000000045d 18699 1726882358.60505: variable 'ansible_search_path' from source: unknown 18699 1726882358.60512: variable 'ansible_search_path' from source: unknown 18699 1726882358.60554: calling self._execute() 18699 1726882358.60654: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.60667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882358.60704: variable 'omit' from source: magic vars 18699 1726882358.61154: variable 'ansible_distribution_major_version' from source: facts 18699 1726882358.61235: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882358.61239: variable 'omit' from source: magic vars 18699 1726882358.61379: variable 'omit' from source: magic vars 18699 1726882358.61699: variable 'omit' from source: magic vars 18699 1726882358.61702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882358.61706: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882358.61708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882358.61710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882358.61713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882358.61715: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882358.61717: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.61719: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 18699 1726882358.61773: Set connection var ansible_connection to ssh 18699 1726882358.61787: Set connection var ansible_pipelining to False 18699 1726882358.61803: Set connection var ansible_shell_executable to /bin/sh 18699 1726882358.61814: Set connection var ansible_timeout to 10 18699 1726882358.61820: Set connection var ansible_shell_type to sh 18699 1726882358.61829: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882358.61859: variable 'ansible_shell_executable' from source: unknown 18699 1726882358.61867: variable 'ansible_connection' from source: unknown 18699 1726882358.61875: variable 'ansible_module_compression' from source: unknown 18699 1726882358.61880: variable 'ansible_shell_type' from source: unknown 18699 1726882358.61886: variable 'ansible_shell_executable' from source: unknown 18699 1726882358.61892: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882358.61906: variable 'ansible_pipelining' from source: unknown 18699 1726882358.61913: variable 'ansible_timeout' from source: unknown 18699 1726882358.61920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882358.62113: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882358.62130: variable 'omit' from source: magic vars 18699 1726882358.62140: starting attempt loop 18699 1726882358.62147: running the handler 18699 1726882358.62165: _low_level_execute_command(): starting 18699 1726882358.62178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882358.62926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882358.63025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882358.63052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882358.63136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882358.64819: stdout chunk (state=3): >>>/root <<< 18699 1726882358.64919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882358.64975: stderr chunk (state=3): >>><<< 18699 1726882358.64996: stdout chunk (state=3): >>><<< 18699 1726882358.65108: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882358.65112: _low_level_execute_command(): starting 18699 1726882358.65115: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014 `" && echo ansible-tmp-1726882358.6501782-20281-10144602845014="` echo /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014 `" ) && sleep 0' 18699 1726882358.65761: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882358.65834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882358.65860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882358.65882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882358.66004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882358.67865: stdout chunk (state=3): >>>ansible-tmp-1726882358.6501782-20281-10144602845014=/root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014 <<< 18699 1726882358.68036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882358.68039: stdout chunk (state=3): >>><<< 18699 1726882358.68042: stderr chunk (state=3): >>><<< 18699 1726882358.68058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882358.6501782-20281-10144602845014=/root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882358.68112: variable 'ansible_module_compression' from source: unknown 18699 1726882358.68198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18699 1726882358.68215: variable 'ansible_facts' from source: unknown 18699 1726882358.68314: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py 18699 1726882358.68538: Sending initial data 18699 1726882358.68541: Sent initial data (161 bytes) 18699 1726882358.69109: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882358.69167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882358.69184: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882358.69208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882358.69285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882358.70850: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882358.70939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882358.70951: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpvn78ku3t /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py <<< 18699 1726882358.70988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py" <<< 18699 1726882358.70992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpvn78ku3t" to remote "/root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py" <<< 18699 1726882358.71796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882358.71969: stderr chunk (state=3): >>><<< 18699 1726882358.71972: stdout chunk (state=3): >>><<< 18699 1726882358.71975: done transferring module to remote 18699 1726882358.71977: _low_level_execute_command(): starting 18699 1726882358.71979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/ /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py && sleep 0' 18699 1726882358.72580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882358.72598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882358.72617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882358.72737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882358.72747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882358.72767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882358.72837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882358.74576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882358.74589: stdout chunk (state=3): >>><<< 18699 1726882358.74615: stderr chunk (state=3): >>><<< 18699 1726882358.74701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882358.74705: _low_level_execute_command(): starting 18699 1726882358.74708: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/AnsiballZ_service_facts.py && sleep 0' 18699 1726882358.75302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882358.75317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882358.75332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882358.75360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882358.75465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882358.75489: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882358.75577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882360.26516: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18699 1726882360.27787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882360.27915: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 18699 1726882360.27919: stderr chunk (state=3): >>><<< 18699 1726882360.28100: stdout chunk (state=3): >>><<< 18699 1726882360.28106: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": 
{"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882360.30255: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882360.30259: _low_level_execute_command(): starting 18699 1726882360.30262: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882358.6501782-20281-10144602845014/ > /dev/null 2>&1 && sleep 0' 18699 1726882360.31310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882360.31328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882360.31343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882360.31472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882360.31490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882360.31510: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882360.31527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882360.31784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882360.31820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882360.32017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882360.33832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882360.33842: stdout chunk (state=3): >>><<< 18699 1726882360.33855: stderr chunk (state=3): >>><<< 18699 1726882360.33916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882360.33928: handler run complete 18699 1726882360.34327: variable 'ansible_facts' from source: unknown 18699 1726882360.34682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882360.35681: variable 'ansible_facts' from source: unknown 18699 1726882360.36018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882360.36457: attempt loop complete, returning result 18699 1726882360.36467: _execute() done 18699 1726882360.36473: dumping result to json 18699 1726882360.36622: done dumping result, returning 18699 1726882360.36640: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-1ce6-d207-00000000045d] 18699 1726882360.36649: sending task result for task 12673a56-9f93-1ce6-d207-00000000045d ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882360.38489: no more pending results, returning what we have 18699 1726882360.38492: results queue empty 18699 1726882360.38494: checking for any_errors_fatal 18699 1726882360.38499: done checking for any_errors_fatal 18699 1726882360.38500: checking for max_fail_percentage 18699 1726882360.38501: done checking for max_fail_percentage 18699 1726882360.38502: checking to see if all hosts have failed and the running result is not ok 18699 1726882360.38502: done checking to see if all hosts have failed 18699 1726882360.38503: getting the remaining hosts for this loop 18699 1726882360.38504: done getting the remaining hosts for this loop 18699 1726882360.38507: getting the next task for host managed_node1 18699 1726882360.38512: done getting next task for 
host managed_node1 18699 1726882360.38515: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18699 1726882360.38517: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882360.38525: getting variables 18699 1726882360.38527: in VariableManager get_vars() 18699 1726882360.38556: Calling all_inventory to load vars for managed_node1 18699 1726882360.38559: Calling groups_inventory to load vars for managed_node1 18699 1726882360.38561: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882360.38569: Calling all_plugins_play to load vars for managed_node1 18699 1726882360.38571: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882360.38574: Calling groups_plugins_play to load vars for managed_node1 18699 1726882360.39500: done sending task result for task 12673a56-9f93-1ce6-d207-00000000045d 18699 1726882360.39504: WORKER PROCESS EXITING 18699 1726882360.41811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882360.44995: done with get_vars() 18699 1726882360.45028: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:32:40 -0400 (0:00:01.854) 0:00:34.047 ****** 18699 1726882360.45123: entering 
_queue_task() for managed_node1/package_facts 18699 1726882360.45879: worker is 1 (out of 1 available) 18699 1726882360.45892: exiting _queue_task() for managed_node1/package_facts 18699 1726882360.45909: done queuing things up, now waiting for results queue to drain 18699 1726882360.45910: waiting for pending results... 18699 1726882360.46501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 18699 1726882360.46681: in run() - task 12673a56-9f93-1ce6-d207-00000000045e 18699 1726882360.47002: variable 'ansible_search_path' from source: unknown 18699 1726882360.47005: variable 'ansible_search_path' from source: unknown 18699 1726882360.47008: calling self._execute() 18699 1726882360.47153: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882360.47168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882360.47178: variable 'omit' from source: magic vars 18699 1726882360.48016: variable 'ansible_distribution_major_version' from source: facts 18699 1726882360.48035: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882360.48103: variable 'omit' from source: magic vars 18699 1726882360.48174: variable 'omit' from source: magic vars 18699 1726882360.48440: variable 'omit' from source: magic vars 18699 1726882360.48599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882360.48603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882360.48606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882360.48609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882360.48611: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882360.48742: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882360.48751: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882360.48759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882360.48867: Set connection var ansible_connection to ssh 18699 1726882360.48954: Set connection var ansible_pipelining to False 18699 1726882360.48966: Set connection var ansible_shell_executable to /bin/sh 18699 1726882360.48977: Set connection var ansible_timeout to 10 18699 1726882360.48984: Set connection var ansible_shell_type to sh 18699 1726882360.48999: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882360.49034: variable 'ansible_shell_executable' from source: unknown 18699 1726882360.49271: variable 'ansible_connection' from source: unknown 18699 1726882360.49275: variable 'ansible_module_compression' from source: unknown 18699 1726882360.49277: variable 'ansible_shell_type' from source: unknown 18699 1726882360.49279: variable 'ansible_shell_executable' from source: unknown 18699 1726882360.49281: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882360.49283: variable 'ansible_pipelining' from source: unknown 18699 1726882360.49285: variable 'ansible_timeout' from source: unknown 18699 1726882360.49287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882360.49629: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882360.49646: variable 'omit' from source: magic vars 18699 1726882360.49656: starting attempt loop 18699 1726882360.49663: running 
the handler 18699 1726882360.49683: _low_level_execute_command(): starting 18699 1726882360.49719: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882360.51214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882360.51354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882360.51484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882360.51513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882360.51590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882360.53171: stdout chunk (state=3): >>>/root <<< 18699 1726882360.53472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882360.53475: stdout chunk (state=3): >>><<< 18699 1726882360.53478: stderr chunk (state=3): >>><<< 18699 1726882360.53504: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882360.53609: _low_level_execute_command(): starting 18699 1726882360.53613: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184 `" && echo ansible-tmp-1726882360.5351472-20358-110996195848184="` echo /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184 `" ) && sleep 0' 18699 1726882360.54813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882360.56545: stdout chunk (state=3): >>>ansible-tmp-1726882360.5351472-20358-110996195848184=/root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184 <<< 18699 1726882360.56679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882360.56716: stderr chunk (state=3): >>><<< 18699 1726882360.56725: stdout chunk (state=3): >>><<< 18699 1726882360.56750: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882360.5351472-20358-110996195848184=/root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882360.56981: variable 'ansible_module_compression' from source: unknown 18699 1726882360.56997: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18699 1726882360.57091: variable 'ansible_facts' from source: unknown 18699 1726882360.57482: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py 18699 1726882360.57973: Sending initial data 18699 1726882360.57976: Sent initial data (162 bytes) 18699 1726882360.59006: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882360.59021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882360.59032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882360.59083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882360.59316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882360.59370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882360.61037: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py" <<< 18699 1726882360.61044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp2f6mhs4c /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py <<< 18699 1726882360.61048: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp2f6mhs4c" to remote "/root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py" <<< 18699 1726882360.63812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882360.63816: stdout chunk (state=3): >>><<< 18699 1726882360.63818: stderr chunk (state=3): >>><<< 18699 1726882360.63820: done transferring module to remote 18699 1726882360.63821: _low_level_execute_command(): starting 18699 1726882360.63823: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/ /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py && sleep 0' 18699 1726882360.64916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882360.65003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882360.65018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882360.65106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882360.65110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882360.65282: 
stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882360.65297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882360.65312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882360.65376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882360.67208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882360.67212: stderr chunk (state=3): >>><<< 18699 1726882360.67214: stdout chunk (state=3): >>><<< 18699 1726882360.67217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882360.67224: _low_level_execute_command(): starting 18699 1726882360.67227: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/AnsiballZ_package_facts.py && sleep 0' 18699 1726882360.68214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882360.68218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882360.68221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882360.68223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882360.68272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
<<< 18699 1726882360.68512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882360.68601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882361.12316: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 18699 1726882361.12529: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": 
[{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": 
"2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 18699 1726882361.12542: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", 
"release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": 
[{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": 
"1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": 
"3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": 
[{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": 
[{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": 
"2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": 
"perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoc<<< 18699 1726882361.12555: stdout chunk (state=3): >>>h": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18699 1726882361.14511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882361.14515: stdout chunk (state=3): >>><<< 18699 1726882361.14517: stderr chunk (state=3): >>><<< 18699 1726882361.14605: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882361.18199: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882361.18235: _low_level_execute_command(): starting 18699 1726882361.18247: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882360.5351472-20358-110996195848184/ > /dev/null 2>&1 && sleep 0' 18699 1726882361.19003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882361.19041: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882361.19068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882361.19085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882361.19201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882361.21018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882361.21029: stdout chunk (state=3): >>><<< 18699 1726882361.21042: stderr chunk (state=3): >>><<< 18699 1726882361.21066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882361.21199: handler run complete 18699 1726882361.26848: variable 'ansible_facts' from source: unknown 18699 1726882361.27270: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.29161: variable 'ansible_facts' from source: unknown 18699 1726882361.29597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.30312: attempt loop complete, returning result 18699 1726882361.30333: _execute() done 18699 1726882361.30345: dumping result to json 18699 1726882361.30576: done dumping result, returning 18699 1726882361.30590: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-1ce6-d207-00000000045e] 18699 1726882361.30603: sending task result for task 12673a56-9f93-1ce6-d207-00000000045e 18699 1726882361.39321: done sending task result for task 12673a56-9f93-1ce6-d207-00000000045e 18699 1726882361.39325: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882361.39429: no more pending results, returning what we have 18699 1726882361.39432: results queue empty 18699 1726882361.39433: checking for any_errors_fatal 18699 1726882361.39436: done checking for any_errors_fatal 18699 1726882361.39437: checking for max_fail_percentage 18699 1726882361.39438: done checking for max_fail_percentage 18699 1726882361.39438: checking to see if all hosts have failed and the running result is not ok 18699 1726882361.39439: done checking to see if all hosts have failed 18699 1726882361.39440: getting the remaining hosts for this loop 18699 1726882361.39441: done getting the remaining hosts for this loop 18699 1726882361.39444: getting the next task for host managed_node1 18699 1726882361.39449: done getting next task for host managed_node1 18699 1726882361.39451: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18699 1726882361.39453: 
^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882361.39461: getting variables 18699 1726882361.39462: in VariableManager get_vars() 18699 1726882361.39484: Calling all_inventory to load vars for managed_node1 18699 1726882361.39486: Calling groups_inventory to load vars for managed_node1 18699 1726882361.39488: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882361.39497: Calling all_plugins_play to load vars for managed_node1 18699 1726882361.39500: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882361.39503: Calling groups_plugins_play to load vars for managed_node1 18699 1726882361.41396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.43321: done with get_vars() 18699 1726882361.43351: done getting variables 18699 1726882361.43404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:32:41 -0400 (0:00:00.983) 0:00:35.030 ****** 18699 1726882361.43432: entering _queue_task() for managed_node1/debug 18699 1726882361.43778: worker is 1 (out of 1 available) 18699 1726882361.43790: exiting _queue_task() for managed_node1/debug 18699 1726882361.43903: done queuing things up, now waiting for results queue to drain 18699 
1726882361.43904: waiting for pending results... 18699 1726882361.44112: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 18699 1726882361.44214: in run() - task 12673a56-9f93-1ce6-d207-00000000005d 18699 1726882361.44238: variable 'ansible_search_path' from source: unknown 18699 1726882361.44398: variable 'ansible_search_path' from source: unknown 18699 1726882361.44402: calling self._execute() 18699 1726882361.44405: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882361.44408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882361.44411: variable 'omit' from source: magic vars 18699 1726882361.44787: variable 'ansible_distribution_major_version' from source: facts 18699 1726882361.44807: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882361.44820: variable 'omit' from source: magic vars 18699 1726882361.44870: variable 'omit' from source: magic vars 18699 1726882361.44974: variable 'network_provider' from source: set_fact 18699 1726882361.44999: variable 'omit' from source: magic vars 18699 1726882361.45041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882361.45083: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882361.45112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882361.45135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882361.45151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882361.45192: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882361.45204: variable 'ansible_host' from 
source: host vars for 'managed_node1' 18699 1726882361.45212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882361.45318: Set connection var ansible_connection to ssh 18699 1726882361.45331: Set connection var ansible_pipelining to False 18699 1726882361.45340: Set connection var ansible_shell_executable to /bin/sh 18699 1726882361.45348: Set connection var ansible_timeout to 10 18699 1726882361.45354: Set connection var ansible_shell_type to sh 18699 1726882361.45362: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882361.45400: variable 'ansible_shell_executable' from source: unknown 18699 1726882361.45501: variable 'ansible_connection' from source: unknown 18699 1726882361.45504: variable 'ansible_module_compression' from source: unknown 18699 1726882361.45507: variable 'ansible_shell_type' from source: unknown 18699 1726882361.45509: variable 'ansible_shell_executable' from source: unknown 18699 1726882361.45511: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882361.45513: variable 'ansible_pipelining' from source: unknown 18699 1726882361.45515: variable 'ansible_timeout' from source: unknown 18699 1726882361.45518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882361.45585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882361.45609: variable 'omit' from source: magic vars 18699 1726882361.45621: starting attempt loop 18699 1726882361.45629: running the handler 18699 1726882361.45678: handler run complete 18699 1726882361.45698: attempt loop complete, returning result 18699 1726882361.45706: _execute() done 18699 1726882361.45717: dumping result to json 18699 
1726882361.45723: done dumping result, returning 18699 1726882361.45734: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-1ce6-d207-00000000005d] 18699 1726882361.45741: sending task result for task 12673a56-9f93-1ce6-d207-00000000005d ok: [managed_node1] => {} MSG: Using network provider: nm 18699 1726882361.45964: no more pending results, returning what we have 18699 1726882361.45967: results queue empty 18699 1726882361.45968: checking for any_errors_fatal 18699 1726882361.45980: done checking for any_errors_fatal 18699 1726882361.45981: checking for max_fail_percentage 18699 1726882361.45983: done checking for max_fail_percentage 18699 1726882361.45984: checking to see if all hosts have failed and the running result is not ok 18699 1726882361.45985: done checking to see if all hosts have failed 18699 1726882361.45986: getting the remaining hosts for this loop 18699 1726882361.45987: done getting the remaining hosts for this loop 18699 1726882361.45991: getting the next task for host managed_node1 18699 1726882361.45999: done getting next task for host managed_node1 18699 1726882361.46004: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18699 1726882361.46006: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882361.46016: getting variables 18699 1726882361.46018: in VariableManager get_vars() 18699 1726882361.46056: Calling all_inventory to load vars for managed_node1 18699 1726882361.46059: Calling groups_inventory to load vars for managed_node1 18699 1726882361.46062: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882361.46073: Calling all_plugins_play to load vars for managed_node1 18699 1726882361.46076: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882361.46079: Calling groups_plugins_play to load vars for managed_node1 18699 1726882361.46607: done sending task result for task 12673a56-9f93-1ce6-d207-00000000005d 18699 1726882361.46611: WORKER PROCESS EXITING 18699 1726882361.47695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.50398: done with get_vars() 18699 1726882361.50429: done getting variables 18699 1726882361.50490: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:32:41 -0400 (0:00:00.073) 0:00:35.103 ****** 18699 1726882361.50788: entering _queue_task() for managed_node1/fail 18699 1726882361.51350: worker is 1 (out of 1 available) 18699 1726882361.51362: exiting _queue_task() for managed_node1/fail 18699 1726882361.51373: done queuing things up, now waiting for results queue to drain 18699 1726882361.51374: waiting for pending results... 
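The "Print network provider" result above (`MSG: Using network provider: nm`) comes from a `debug` task at `roles/network/tasks/main.yml:7`. A minimal sketch of what that task likely looks like, reconstructed from the log; the exact wording in the role source may differ:

```yaml
# Hypothetical reconstruction of the task whose result is logged above.
# The variable name network_provider is confirmed by the log
# ("variable 'network_provider' from source: set_fact"); the msg text is assumed.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```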
18699 1726882361.51872: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18699 1726882361.51997: in run() - task 12673a56-9f93-1ce6-d207-00000000005e 18699 1726882361.52145: variable 'ansible_search_path' from source: unknown 18699 1726882361.52154: variable 'ansible_search_path' from source: unknown 18699 1726882361.52302: calling self._execute() 18699 1726882361.52558: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882361.53000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882361.53004: variable 'omit' from source: magic vars 18699 1726882361.53384: variable 'ansible_distribution_major_version' from source: facts 18699 1726882361.53799: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882361.53802: variable 'network_state' from source: role '' defaults 18699 1726882361.53806: Evaluated conditional (network_state != {}): False 18699 1726882361.53808: when evaluation is False, skipping this task 18699 1726882361.53810: _execute() done 18699 1726882361.53812: dumping result to json 18699 1726882361.53814: done dumping result, returning 18699 1726882361.53817: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-1ce6-d207-00000000005e] 18699 1726882361.53819: sending task result for task 12673a56-9f93-1ce6-d207-00000000005e 18699 1726882361.53896: done sending task result for task 12673a56-9f93-1ce6-d207-00000000005e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882361.53944: no more pending results, returning what we have 18699 1726882361.53948: results 
queue empty 18699 1726882361.53949: checking for any_errors_fatal 18699 1726882361.53956: done checking for any_errors_fatal 18699 1726882361.53956: checking for max_fail_percentage 18699 1726882361.53958: done checking for max_fail_percentage 18699 1726882361.53959: checking to see if all hosts have failed and the running result is not ok 18699 1726882361.53960: done checking to see if all hosts have failed 18699 1726882361.53960: getting the remaining hosts for this loop 18699 1726882361.53962: done getting the remaining hosts for this loop 18699 1726882361.53966: getting the next task for host managed_node1 18699 1726882361.53972: done getting next task for host managed_node1 18699 1726882361.53976: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18699 1726882361.53979: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882361.53992: getting variables 18699 1726882361.53996: in VariableManager get_vars() 18699 1726882361.54036: Calling all_inventory to load vars for managed_node1 18699 1726882361.54038: Calling groups_inventory to load vars for managed_node1 18699 1726882361.54040: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882361.54046: WORKER PROCESS EXITING 18699 1726882361.54057: Calling all_plugins_play to load vars for managed_node1 18699 1726882361.54059: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882361.54062: Calling groups_plugins_play to load vars for managed_node1 18699 1726882361.56655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.59440: done with get_vars() 18699 1726882361.59465: done getting variables 18699 1726882361.59536: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:32:41 -0400 (0:00:00.087) 0:00:35.191 ****** 18699 1726882361.59567: entering _queue_task() for managed_node1/fail 18699 1726882361.60053: worker is 1 (out of 1 available) 18699 1726882361.60065: exiting _queue_task() for managed_node1/fail 18699 1726882361.60074: done queuing things up, now waiting for results queue to drain 18699 1726882361.60075: waiting for pending results... 
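The "Abort applying the network state configuration ..." task above is a `fail` action that is skipped because its guard evaluated False (`"false_condition": "network_state != {}"`). A hedged sketch of that guarded-fail pattern; only the `network_state != {}` term is confirmed by the log, and the failure message plus any additional `when` terms are assumptions:

```yaml
# Sketch of the guarded fail seen in the log. The when expression is taken
# from the logged false_condition; the msg text is assumed.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with the initscripts provider
  when: network_state != {}   # evaluated False here, so the task is skipped
```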
18699 1726882361.60276: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18699 1726882361.60399: in run() - task 12673a56-9f93-1ce6-d207-00000000005f 18699 1726882361.60426: variable 'ansible_search_path' from source: unknown 18699 1726882361.60436: variable 'ansible_search_path' from source: unknown 18699 1726882361.60479: calling self._execute() 18699 1726882361.60601: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882361.60617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882361.60636: variable 'omit' from source: magic vars 18699 1726882361.61028: variable 'ansible_distribution_major_version' from source: facts 18699 1726882361.61052: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882361.61188: variable 'network_state' from source: role '' defaults 18699 1726882361.61263: Evaluated conditional (network_state != {}): False 18699 1726882361.61266: when evaluation is False, skipping this task 18699 1726882361.61269: _execute() done 18699 1726882361.61273: dumping result to json 18699 1726882361.61276: done dumping result, returning 18699 1726882361.61279: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-1ce6-d207-00000000005f] 18699 1726882361.61282: sending task result for task 12673a56-9f93-1ce6-d207-00000000005f 18699 1726882361.61357: done sending task result for task 12673a56-9f93-1ce6-d207-00000000005f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882361.61410: no more pending results, returning what we have 18699 1726882361.61413: results queue empty 18699 
1726882361.61414: checking for any_errors_fatal 18699 1726882361.61421: done checking for any_errors_fatal 18699 1726882361.61422: checking for max_fail_percentage 18699 1726882361.61424: done checking for max_fail_percentage 18699 1726882361.61425: checking to see if all hosts have failed and the running result is not ok 18699 1726882361.61425: done checking to see if all hosts have failed 18699 1726882361.61426: getting the remaining hosts for this loop 18699 1726882361.61427: done getting the remaining hosts for this loop 18699 1726882361.61432: getting the next task for host managed_node1 18699 1726882361.61439: done getting next task for host managed_node1 18699 1726882361.61443: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18699 1726882361.61446: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882361.61460: getting variables 18699 1726882361.61462: in VariableManager get_vars() 18699 1726882361.61618: Calling all_inventory to load vars for managed_node1 18699 1726882361.61622: Calling groups_inventory to load vars for managed_node1 18699 1726882361.61625: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882361.61638: Calling all_plugins_play to load vars for managed_node1 18699 1726882361.61641: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882361.61643: Calling groups_plugins_play to load vars for managed_node1 18699 1726882361.62209: WORKER PROCESS EXITING 18699 1726882361.63176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.64784: done with get_vars() 18699 1726882361.64813: done getting variables 18699 1726882361.64869: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:32:41 -0400 (0:00:00.053) 0:00:35.245 ****** 18699 1726882361.64899: entering _queue_task() for managed_node1/fail 18699 1726882361.65208: worker is 1 (out of 1 available) 18699 1726882361.65219: exiting _queue_task() for managed_node1/fail 18699 1726882361.65229: done queuing things up, now waiting for results queue to drain 18699 1726882361.65230: waiting for pending results... 
18699 1726882361.65515: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18699 1726882361.65629: in run() - task 12673a56-9f93-1ce6-d207-000000000060 18699 1726882361.65647: variable 'ansible_search_path' from source: unknown 18699 1726882361.65653: variable 'ansible_search_path' from source: unknown 18699 1726882361.65700: calling self._execute() 18699 1726882361.65799: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882361.65900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882361.65904: variable 'omit' from source: magic vars 18699 1726882361.66187: variable 'ansible_distribution_major_version' from source: facts 18699 1726882361.66207: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882361.66376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882361.68582: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882361.68657: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882361.68702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882361.68749: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882361.68781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882361.68866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882361.69243: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882361.69282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882361.69328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882361.69350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882361.69489: variable 'ansible_distribution_major_version' from source: facts 18699 1726882361.69495: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18699 1726882361.69603: variable 'ansible_distribution' from source: facts 18699 1726882361.69613: variable '__network_rh_distros' from source: role '' defaults 18699 1726882361.69630: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18699 1726882361.69890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882361.69998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882361.70002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 
1726882361.70005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882361.70012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882361.70070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882361.70100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882361.70132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882361.70183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882361.70205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882361.70258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882361.70398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18699 1726882361.70402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882361.70404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882361.70407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882361.70692: variable 'network_connections' from source: play vars 18699 1726882361.70710: variable 'profile' from source: play vars 18699 1726882361.70778: variable 'profile' from source: play vars 18699 1726882361.70786: variable 'interface' from source: set_fact 18699 1726882361.70854: variable 'interface' from source: set_fact 18699 1726882361.70869: variable 'network_state' from source: role '' defaults 18699 1726882361.71200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882361.71379: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882361.71509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882361.71549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882361.71582: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882361.71677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882361.71766: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882361.71798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882361.71878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882361.72000: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18699 1726882361.72003: when evaluation is False, skipping this task 18699 1726882361.72006: _execute() done 18699 1726882361.72008: dumping result to json 18699 1726882361.72010: done dumping result, returning 18699 1726882361.72025: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-1ce6-d207-000000000060] 18699 1726882361.72034: sending task result for task 12673a56-9f93-1ce6-d207-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18699 1726882361.72546: no more pending results, returning what we have 18699 1726882361.72550: results queue empty 18699 1726882361.72551: checking for 
any_errors_fatal 18699 1726882361.72557: done checking for any_errors_fatal 18699 1726882361.72557: checking for max_fail_percentage 18699 1726882361.72559: done checking for max_fail_percentage 18699 1726882361.72560: checking to see if all hosts have failed and the running result is not ok 18699 1726882361.72561: done checking to see if all hosts have failed 18699 1726882361.72562: getting the remaining hosts for this loop 18699 1726882361.72563: done getting the remaining hosts for this loop 18699 1726882361.72567: getting the next task for host managed_node1 18699 1726882361.72575: done getting next task for host managed_node1 18699 1726882361.72579: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18699 1726882361.72581: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882361.72595: getting variables 18699 1726882361.72597: in VariableManager get_vars() 18699 1726882361.72638: Calling all_inventory to load vars for managed_node1 18699 1726882361.72641: Calling groups_inventory to load vars for managed_node1 18699 1726882361.72644: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882361.72654: Calling all_plugins_play to load vars for managed_node1 18699 1726882361.72657: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882361.72660: Calling groups_plugins_play to load vars for managed_node1 18699 1726882361.73500: done sending task result for task 12673a56-9f93-1ce6-d207-000000000060 18699 1726882361.73503: WORKER PROCESS EXITING 18699 1726882361.75747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882361.78636: done with get_vars() 18699 1726882361.78670: done getting variables 18699 1726882361.78734: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:32:41 -0400 (0:00:00.138) 0:00:35.383 ****** 18699 1726882361.78773: entering _queue_task() for managed_node1/dnf 18699 1726882361.79225: worker is 1 (out of 1 available) 18699 1726882361.79235: exiting _queue_task() for managed_node1/dnf 18699 1726882361.79244: done queuing things up, now waiting for results queue to drain 18699 1726882361.79245: waiting for pending results... 
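The teaming-abort conditional quoted (with escaped quotes) in the skip result above is easier to read unescaped. A sketch of the `when` clause as the role presumably declares it; the expression is copied directly from the logged `false_condition`, while the surrounding task body and message are assumptions:

```yaml
# `when` expression copied from the logged false_condition; msg text assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later
  when: >-
    network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
```

Both branches count entries whose `type` attribute is defined and equals `team`; with no team profiles in `network_connections` or `network_state`, the whole expression is False and the abort is skipped, as the log shows.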
18699 1726882361.79445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
18699 1726882361.79566: in run() - task 12673a56-9f93-1ce6-d207-000000000061
18699 1726882361.79592: variable 'ansible_search_path' from source: unknown
18699 1726882361.79603: variable 'ansible_search_path' from source: unknown
18699 1726882361.79647: calling self._execute()
18699 1726882361.79764: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882361.79782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882361.79806: variable 'omit' from source: magic vars
18699 1726882361.80190: variable 'ansible_distribution_major_version' from source: facts
18699 1726882361.80212: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882361.80418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18699 1726882361.82681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18699 1726882361.82768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18699 1726882361.82813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18699 1726882361.82860: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18699 1726882361.82890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18699 1726882361.82978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882361.83028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882361.83067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882361.83114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882361.83135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882361.83273: variable 'ansible_distribution' from source: facts
18699 1726882361.83285: variable 'ansible_distribution_major_version' from source: facts
18699 1726882361.83307: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
18699 1726882361.83598: variable '__network_wireless_connections_defined' from source: role '' defaults
18699 1726882361.83601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882361.83604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882361.83621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882361.83666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882361.83686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882361.83739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882361.83768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882361.83799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882361.83848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882361.83868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882361.83912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882361.83947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882361.83975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882361.84020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882361.84045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882361.84214: variable 'network_connections' from source: play vars
18699 1726882361.84232: variable 'profile' from source: play vars
18699 1726882361.84309: variable 'profile' from source: play vars
18699 1726882361.84377: variable 'interface' from source: set_fact
18699 1726882361.84384: variable 'interface' from source: set_fact
18699 1726882361.84460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18699 1726882361.84637: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18699 1726882361.84677: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18699 1726882361.84719: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18699 1726882361.84754: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18699 1726882361.84801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18699 1726882361.84837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18699 1726882361.84901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882361.84909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18699 1726882361.84964: variable '__network_team_connections_defined' from source: role '' defaults
18699 1726882361.85219: variable 'network_connections' from source: play vars
18699 1726882361.85245: variable 'profile' from source: play vars
18699 1726882361.85355: variable 'profile' from source: play vars
18699 1726882361.85359: variable 'interface' from source: set_fact
18699 1726882361.85377: variable 'interface' from source: set_fact
18699 1726882361.85408: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
18699 1726882361.85417: when evaluation is False, skipping this task
18699 1726882361.85424: _execute() done
18699 1726882361.85430: dumping result to json
18699 1726882361.85437: done dumping result, returning
18699 1726882361.85450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000061]
18699 1726882361.85465: sending task result for task 12673a56-9f93-1ce6-d207-000000000061
18699 1726882361.85706: done sending task result for task 12673a56-9f93-1ce6-d207-000000000061
18699 1726882361.85710: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
18699 1726882361.85761: no more pending results, returning what we have
18699 1726882361.85764: results queue empty
18699 1726882361.85765: checking for any_errors_fatal
18699 1726882361.85773: done checking for any_errors_fatal
18699 1726882361.85774: checking for max_fail_percentage
18699 1726882361.85776: done checking for max_fail_percentage
18699 1726882361.85777: checking to see if all hosts have failed and the running result is not ok
18699 1726882361.85777: done checking to see if all hosts have failed
18699 1726882361.85778: getting the remaining hosts for this loop
18699 1726882361.85780: done getting the remaining hosts for this loop
18699 1726882361.85784: getting the next task for host managed_node1
18699 1726882361.85790: done getting next task for host managed_node1
18699 1726882361.85808: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18699 1726882361.85810: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882361.85823: getting variables
18699 1726882361.85825: in VariableManager get_vars()
18699 1726882361.85867: Calling all_inventory to load vars for managed_node1
18699 1726882361.85871: Calling groups_inventory to load vars for managed_node1
18699 1726882361.85874: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882361.85884: Calling all_plugins_play to load vars for managed_node1
18699 1726882361.85887: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882361.85890: Calling groups_plugins_play to load vars for managed_node1
18699 1726882361.87531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882361.89153: done with get_vars()
18699 1726882361.89182: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
18699 1726882361.89258: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:32:41 -0400 (0:00:00.105) 0:00:35.488 ******
18699 1726882361.89295: entering _queue_task() for managed_node1/yum
18699 1726882361.89819: worker is 1 (out of 1 available)
18699 1726882361.89829: exiting _queue_task() for managed_node1/yum
18699 1726882361.89838: done queuing things up, now waiting for results queue to drain
18699 1726882361.89839: waiting for pending results...
18699 1726882361.89959: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
18699 1726882361.90078: in run() - task 12673a56-9f93-1ce6-d207-000000000062
18699 1726882361.90101: variable 'ansible_search_path' from source: unknown
18699 1726882361.90175: variable 'ansible_search_path' from source: unknown
18699 1726882361.90179: calling self._execute()
18699 1726882361.90260: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882361.90272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882361.90292: variable 'omit' from source: magic vars
18699 1726882361.90684: variable 'ansible_distribution_major_version' from source: facts
18699 1726882361.90704: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882361.90891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18699 1726882361.93449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18699 1726882361.93522: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18699 1726882361.93667: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18699 1726882361.93670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18699 1726882361.93673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18699 1726882361.93725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882361.93760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882361.93802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882361.93846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882361.93867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882361.93967: variable 'ansible_distribution_major_version' from source: facts
18699 1726882361.93997: Evaluated conditional (ansible_distribution_major_version | int < 8): False
18699 1726882361.94006: when evaluation is False, skipping this task
18699 1726882361.94013: _execute() done
18699 1726882361.94021: dumping result to json
18699 1726882361.94029: done dumping result, returning
18699 1726882361.94041: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000062]
18699 1726882361.94052: sending task result for task 12673a56-9f93-1ce6-d207-000000000062
18699 1726882361.94259: done sending task result for task 12673a56-9f93-1ce6-d207-000000000062
18699 1726882361.94263: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
18699 1726882361.94321: no more pending results, returning what we have
18699 1726882361.94325: results queue empty
18699 1726882361.94326: checking for any_errors_fatal
18699 1726882361.94331: done checking for any_errors_fatal
18699 1726882361.94332: checking for max_fail_percentage
18699 1726882361.94335: done checking for max_fail_percentage
18699 1726882361.94336: checking to see if all hosts have failed and the running result is not ok
18699 1726882361.94336: done checking to see if all hosts have failed
18699 1726882361.94337: getting the remaining hosts for this loop
18699 1726882361.94339: done getting the remaining hosts for this loop
18699 1726882361.94342: getting the next task for host managed_node1
18699 1726882361.94349: done getting next task for host managed_node1
18699 1726882361.94353: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
18699 1726882361.94355: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882361.94368: getting variables
18699 1726882361.94370: in VariableManager get_vars()
18699 1726882361.94631: Calling all_inventory to load vars for managed_node1
18699 1726882361.94634: Calling groups_inventory to load vars for managed_node1
18699 1726882361.94636: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882361.94645: Calling all_plugins_play to load vars for managed_node1
18699 1726882361.94648: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882361.94651: Calling groups_plugins_play to load vars for managed_node1
18699 1726882361.96199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882361.97820: done with get_vars()
18699 1726882361.97842: done getting variables
18699 1726882361.97908: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:32:41 -0400 (0:00:00.086) 0:00:35.575 ******
18699 1726882361.97939: entering _queue_task() for managed_node1/fail
18699 1726882361.98325: worker is 1 (out of 1 available)
18699 1726882361.98337: exiting _queue_task() for managed_node1/fail
18699 1726882361.98346: done queuing things up, now waiting for results queue to drain
18699 1726882361.98347: waiting for pending results...
18699 1726882361.98710: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
18699 1726882361.98715: in run() - task 12673a56-9f93-1ce6-d207-000000000063
18699 1726882361.98719: variable 'ansible_search_path' from source: unknown
18699 1726882361.98723: variable 'ansible_search_path' from source: unknown
18699 1726882361.98769: calling self._execute()
18699 1726882361.98876: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882361.98887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882361.98905: variable 'omit' from source: magic vars
18699 1726882361.99285: variable 'ansible_distribution_major_version' from source: facts
18699 1726882361.99305: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882361.99501: variable '__network_wireless_connections_defined' from source: role '' defaults
18699 1726882361.99629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18699 1726882362.01878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18699 1726882362.01956: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18699 1726882362.01997: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18699 1726882362.02045: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18699 1726882362.02074: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18699 1726882362.02163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.02213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.02254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.02399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.02403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.02405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.02407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.02428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.02471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.02489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.02540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.02568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.02598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.02647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.02698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.02854: variable 'network_connections' from source: play vars
18699 1726882362.02870: variable 'profile' from source: play vars
18699 1726882362.02988: variable 'profile' from source: play vars
18699 1726882362.03007: variable 'interface' from source: set_fact
18699 1726882362.03181: variable 'interface' from source: set_fact
18699 1726882362.03185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18699 1726882362.03356: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18699 1726882362.03404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18699 1726882362.03439: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18699 1726882362.03470: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18699 1726882362.03524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
18699 1726882362.03549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
18699 1726882362.03580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.03620: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
18699 1726882362.03673: variable '__network_team_connections_defined' from source: role '' defaults
18699 1726882362.03930: variable 'network_connections' from source: play vars
18699 1726882362.03946: variable 'profile' from source: play vars
18699 1726882362.04052: variable 'profile' from source: play vars
18699 1726882362.04055: variable 'interface' from source: set_fact
18699 1726882362.04076: variable 'interface' from source: set_fact
18699 1726882362.04108: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
18699 1726882362.04115: when evaluation is False, skipping this task
18699 1726882362.04122: _execute() done
18699 1726882362.04128: dumping result to json
18699 1726882362.04134: done dumping result, returning
18699 1726882362.04145: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000063]
18699 1726882362.04268: sending task result for task 12673a56-9f93-1ce6-d207-000000000063
18699 1726882362.04343: done sending task result for task 12673a56-9f93-1ce6-d207-000000000063
18699 1726882362.04347: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
18699 1726882362.04400: no more pending results, returning what we have
18699 1726882362.04404: results queue empty
18699 1726882362.04405: checking for any_errors_fatal
18699 1726882362.04411: done checking for any_errors_fatal
18699 1726882362.04412: checking for max_fail_percentage
18699 1726882362.04414: done checking for max_fail_percentage
18699 1726882362.04415: checking to see if all hosts have failed and the running result is not ok
18699 1726882362.04416: done checking to see if all hosts have failed
18699 1726882362.04417: getting the remaining hosts for this loop
18699 1726882362.04418: done getting the remaining hosts for this loop
18699 1726882362.04423: getting the next task for host managed_node1
18699 1726882362.04430: done getting next task for host managed_node1
18699 1726882362.04434: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
18699 1726882362.04436: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882362.04449: getting variables
18699 1726882362.04451: in VariableManager get_vars()
18699 1726882362.04701: Calling all_inventory to load vars for managed_node1
18699 1726882362.04705: Calling groups_inventory to load vars for managed_node1
18699 1726882362.04707: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882362.04717: Calling all_plugins_play to load vars for managed_node1
18699 1726882362.04720: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882362.04722: Calling groups_plugins_play to load vars for managed_node1
18699 1726882362.06922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882362.08826: done with get_vars()
18699 1726882362.08849: done getting variables
18699 1726882362.08917: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:32:42 -0400 (0:00:00.110) 0:00:35.685 ******
18699 1726882362.08950: entering _queue_task() for managed_node1/package
18699 1726882362.09359: worker is 1 (out of 1 available)
18699 1726882362.09372: exiting _queue_task() for managed_node1/package
18699 1726882362.09383: done queuing things up, now waiting for results queue to drain
18699 1726882362.09384: waiting for pending results...
18699 1726882362.09714: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
18699 1726882362.09812: in run() - task 12673a56-9f93-1ce6-d207-000000000064
18699 1726882362.09816: variable 'ansible_search_path' from source: unknown
18699 1726882362.09819: variable 'ansible_search_path' from source: unknown
18699 1726882362.09835: calling self._execute()
18699 1726882362.09944: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882362.09955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882362.09969: variable 'omit' from source: magic vars
18699 1726882362.10361: variable 'ansible_distribution_major_version' from source: facts
18699 1726882362.10416: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882362.10591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
18699 1726882362.10872: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
18699 1726882362.10927: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
18699 1726882362.10970: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
18699 1726882362.11047: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
18699 1726882362.11179: variable 'network_packages' from source: role '' defaults
18699 1726882362.11333: variable '__network_provider_setup' from source: role '' defaults
18699 1726882362.11336: variable '__network_service_name_default_nm' from source: role '' defaults
18699 1726882362.11380: variable '__network_service_name_default_nm' from source: role '' defaults
18699 1726882362.11402: variable '__network_packages_default_nm' from source: role '' defaults
18699 1726882362.11471: variable '__network_packages_default_nm' from source: role '' defaults
18699 1726882362.11671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
18699 1726882362.13767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
18699 1726882362.13843: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
18699 1726882362.13903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
18699 1726882362.13927: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
18699 1726882362.14002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
18699 1726882362.14134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.14137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.14139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.14174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.14197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.14254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.14281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.14311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.14357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.14373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.14612: variable '__network_packages_default_gobject_packages' from source: role '' defaults
18699 1726882362.14803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.15008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.15011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.15013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.15015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.15017: variable 'ansible_python' from source: facts
18699 1726882362.15213: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
18699 1726882362.15303: variable '__network_wpa_supplicant_required' from source: role '' defaults
18699 1726882362.15390: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
18699 1726882362.15840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
18699 1726882362.15867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
18699 1726882362.15925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
18699 1726882362.16041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
18699 1726882362.16067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
18699 1726882362.16299: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.16310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.16328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.16367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.16598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.16703: variable 'network_connections' from source: play vars 18699 1726882362.16716: variable 'profile' from source: play vars 18699 1726882362.16824: variable 'profile' from source: play vars 18699 1726882362.16836: variable 'interface' from source: set_fact 18699 1726882362.16934: variable 'interface' from source: set_fact 18699 1726882362.17007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882362.17046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882362.17078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.17115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882362.17175: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882362.17514: variable 'network_connections' from source: play vars 18699 1726882362.17524: variable 'profile' from source: play vars 18699 1726882362.17632: variable 'profile' from source: play vars 18699 1726882362.17644: variable 'interface' from source: set_fact 18699 1726882362.17726: variable 'interface' from source: set_fact 18699 1726882362.17789: variable '__network_packages_default_wireless' from source: role '' defaults 18699 1726882362.17852: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882362.18174: variable 'network_connections' from source: play vars 18699 1726882362.18184: variable 'profile' from source: play vars 18699 1726882362.18300: variable 'profile' from source: play vars 18699 1726882362.18303: variable 'interface' from source: set_fact 18699 1726882362.18402: variable 'interface' from source: set_fact 18699 1726882362.18433: variable '__network_packages_default_team' from source: role '' defaults 18699 1726882362.18525: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882362.18969: variable 'network_connections' from source: play vars 18699 1726882362.18979: variable 'profile' from source: play vars 18699 1726882362.19054: variable 'profile' from source: play vars 18699 1726882362.19099: variable 'interface' from source: set_fact 18699 1726882362.19169: variable 'interface' from source: set_fact 18699 1726882362.19232: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882362.19287: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882362.19301: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882362.19365: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882362.19596: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18699 1726882362.20101: variable 'network_connections' from source: play vars 18699 1726882362.20210: variable 'profile' from source: play vars 18699 1726882362.20214: variable 'profile' from source: play vars 18699 1726882362.20216: variable 'interface' from source: set_fact 18699 1726882362.20241: variable 'interface' from source: set_fact 18699 1726882362.20254: variable 'ansible_distribution' from source: facts 18699 1726882362.20262: variable '__network_rh_distros' from source: role '' defaults 18699 1726882362.20271: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.20288: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18699 1726882362.20468: variable 'ansible_distribution' from source: facts 18699 1726882362.20477: variable '__network_rh_distros' from source: role '' defaults 18699 1726882362.20484: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.20503: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18699 1726882362.20664: variable 'ansible_distribution' from source: facts 18699 1726882362.20675: variable '__network_rh_distros' from source: role '' defaults 18699 1726882362.20684: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.20725: variable 'network_provider' from source: set_fact 18699 1726882362.20744: variable 'ansible_facts' from source: unknown 18699 1726882362.21498: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18699 
1726882362.21514: when evaluation is False, skipping this task
18699 1726882362.21522: _execute() done
18699 1726882362.21529: dumping result to json
18699 1726882362.21536: done dumping result, returning
18699 1726882362.21547: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-1ce6-d207-000000000064]
18699 1726882362.21557: sending task result for task 12673a56-9f93-1ce6-d207-000000000064
18699 1726882362.21689: done sending task result for task 12673a56-9f93-1ce6-d207-000000000064
18699 1726882362.21692: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
18699 1726882362.21775: no more pending results, returning what we have
18699 1726882362.21779: results queue empty
18699 1726882362.21780: checking for any_errors_fatal
18699 1726882362.21787: done checking for any_errors_fatal
18699 1726882362.21788: checking for max_fail_percentage
18699 1726882362.21789: done checking for max_fail_percentage
18699 1726882362.21790: checking to see if all hosts have failed and the running result is not ok
18699 1726882362.21791: done checking to see if all hosts have failed
18699 1726882362.21792: getting the remaining hosts for this loop
18699 1726882362.21796: done getting the remaining hosts for this loop
18699 1726882362.21801: getting the next task for host managed_node1
18699 1726882362.21807: done getting next task for host managed_node1
18699 1726882362.21811: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
18699 1726882362.21813: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882362.21826: getting variables
18699 1726882362.21828: in VariableManager get_vars()
18699 1726882362.21865: Calling all_inventory to load vars for managed_node1
18699 1726882362.21868: Calling groups_inventory to load vars for managed_node1
18699 1726882362.21870: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882362.21885: Calling all_plugins_play to load vars for managed_node1
18699 1726882362.21888: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882362.21891: Calling groups_plugins_play to load vars for managed_node1
18699 1726882362.23552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882362.25174: done with get_vars()
18699 1726882362.25198: done getting variables
18699 1726882362.25261: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:32:42 -0400 (0:00:00.163) 0:00:35.848 ******
18699 1726882362.25292: entering _queue_task() for managed_node1/package
18699 1726882362.25624: worker is 1 (out of 1 available)
18699 1726882362.25636: exiting _queue_task() for managed_node1/package
18699 1726882362.25647: done queuing things up, now waiting for results queue to drain
18699 1726882362.25648: waiting for pending results...
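For readers following the trace: the `Evaluated conditional (...)` lines and the `false_condition` field in the skip result above come from a `when:` guard on a `package` task. A minimal, hypothetical sketch of such a task, mirroring only the conditions quoted verbatim in this log (the role's actual source lives at roles/network/tasks/main.yml inside the collection, and may differ), would be:

```yaml
# Hypothetical sketch -- reconstructed from the conditionals quoted in the
# log above, not copied from the role's source.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    # logged as "Evaluated conditional (ansible_distribution_major_version != '6'): True"
    - ansible_distribution_major_version != '6'
    # logged as the false_condition in the skip result: every requested
    # package was already present in the gathered package facts
    - not network_packages is subset(ansible_facts.packages.keys())
```

With a list under `when:`, Ansible ANDs the conditions and reports the one that evaluated false as `false_condition`, which is exactly what the skip result above shows: all of `network_packages` were already installed, so the task was skipped.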
18699 1726882362.26026: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
18699 1726882362.26044: in run() - task 12673a56-9f93-1ce6-d207-000000000065
18699 1726882362.26100: variable 'ansible_search_path' from source: unknown
18699 1726882362.26104: variable 'ansible_search_path' from source: unknown
18699 1726882362.26129: calling self._execute()
18699 1726882362.26238: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882362.26248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882362.26260: variable 'omit' from source: magic vars
18699 1726882362.26648: variable 'ansible_distribution_major_version' from source: facts
18699 1726882362.26774: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882362.26791: variable 'network_state' from source: role '' defaults
18699 1726882362.26809: Evaluated conditional (network_state != {}): False
18699 1726882362.26817: when evaluation is False, skipping this task
18699 1726882362.26825: _execute() done
18699 1726882362.26831: dumping result to json
18699 1726882362.26839: done dumping result, returning
18699 1726882362.26851: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-1ce6-d207-000000000065]
18699 1726882362.26860: sending task result for task 12673a56-9f93-1ce6-d207-000000000065
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18699 1726882362.27033: no more pending results, returning what we have
18699 1726882362.27037: results queue empty
18699 1726882362.27038: checking for any_errors_fatal
18699 1726882362.27047: done checking for any_errors_fatal
18699 1726882362.27048: checking for max_fail_percentage
18699 1726882362.27050: done checking for max_fail_percentage
18699 1726882362.27051: checking to see if all hosts have failed and the running result is not ok
18699 1726882362.27052: done checking to see if all hosts have failed
18699 1726882362.27052: getting the remaining hosts for this loop
18699 1726882362.27054: done getting the remaining hosts for this loop
18699 1726882362.27058: getting the next task for host managed_node1
18699 1726882362.27065: done getting next task for host managed_node1
18699 1726882362.27069: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
18699 1726882362.27071: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882362.27085: getting variables
18699 1726882362.27087: in VariableManager get_vars()
18699 1726882362.27325: Calling all_inventory to load vars for managed_node1
18699 1726882362.27329: Calling groups_inventory to load vars for managed_node1
18699 1726882362.27331: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882362.27341: Calling all_plugins_play to load vars for managed_node1
18699 1726882362.27344: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882362.27347: Calling groups_plugins_play to load vars for managed_node1
18699 1726882362.27909: done sending task result for task 12673a56-9f93-1ce6-d207-000000000065
18699 1726882362.27913: WORKER PROCESS EXITING
18699 1726882362.29113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882362.30669: done with get_vars()
18699 1726882362.30690: done getting variables
18699 1726882362.30753: Loading ActionModule 'package' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:32:42 -0400 (0:00:00.054) 0:00:35.903 ******
18699 1726882362.30783: entering _queue_task() for managed_node1/package
18699 1726882362.31131: worker is 1 (out of 1 available)
18699 1726882362.31142: exiting _queue_task() for managed_node1/package
18699 1726882362.31155: done queuing things up, now waiting for results queue to drain
18699 1726882362.31156: waiting for pending results...
18699 1726882362.31398: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
18699 1726882362.31519: in run() - task 12673a56-9f93-1ce6-d207-000000000066
18699 1726882362.31535: variable 'ansible_search_path' from source: unknown
18699 1726882362.31541: variable 'ansible_search_path' from source: unknown
18699 1726882362.31577: calling self._execute()
18699 1726882362.31679: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882362.31691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882362.31710: variable 'omit' from source: magic vars
18699 1726882362.32088: variable 'ansible_distribution_major_version' from source: facts
18699 1726882362.32106: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882362.32220: variable 'network_state' from source: role '' defaults
18699 1726882362.32236: Evaluated conditional (network_state != {}): False
18699 1726882362.32243: when evaluation is False, skipping this task
18699 1726882362.32251: _execute() done
18699 1726882362.32258: dumping result to json
18699 1726882362.32266: done dumping result, returning
18699 1726882362.32286: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-1ce6-d207-000000000066]
18699 1726882362.32300: sending task result for task 12673a56-9f93-1ce6-d207-000000000066
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
18699 1726882362.32448: no more pending results, returning what we have
18699 1726882362.32452: results queue empty
18699 1726882362.32453: checking for any_errors_fatal
18699 1726882362.32460: done checking for any_errors_fatal
18699 1726882362.32461: checking for max_fail_percentage
18699 1726882362.32463: done checking for max_fail_percentage
18699 1726882362.32464: checking to see if all hosts have failed and the running result is not ok
18699 1726882362.32465: done checking to see if all hosts have failed
18699 1726882362.32465: getting the remaining hosts for this loop
18699 1726882362.32467: done getting the remaining hosts for this loop
18699 1726882362.32470: getting the next task for host managed_node1
18699 1726882362.32477: done getting next task for host managed_node1
18699 1726882362.32481: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
18699 1726882362.32483: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882362.32603: getting variables
18699 1726882362.32605: in VariableManager get_vars()
18699 1726882362.32640: Calling all_inventory to load vars for managed_node1
18699 1726882362.32643: Calling groups_inventory to load vars for managed_node1
18699 1726882362.32645: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882362.32656: Calling all_plugins_play to load vars for managed_node1
18699 1726882362.32658: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882362.32660: Calling groups_plugins_play to load vars for managed_node1
18699 1726882362.33207: done sending task result for task 12673a56-9f93-1ce6-d207-000000000066
18699 1726882362.33210: WORKER PROCESS EXITING
18699 1726882362.34152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882362.35780: done with get_vars()
18699 1726882362.35807: done getting variables
18699 1726882362.35870: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:32:42 -0400 (0:00:00.051) 0:00:35.955 ******
18699 1726882362.35907: entering _queue_task() for managed_node1/service
18699 1726882362.36425: worker is 1 (out of 1 available)
18699 1726882362.36434: exiting _queue_task() for managed_node1/service
18699 1726882362.36442: done queuing things up, now waiting for results queue to drain
18699 1726882362.36444: waiting for pending results...
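Both package tasks in this stretch of the log, "Install NetworkManager and nmstate when using network_state variable" and "Install python3-libnmstate when using network_state variable", skip for the same reason: `network_state` keeps its role default of `{}`, so the guard `network_state != {}` is false. A hypothetical sketch of this pair of guarded tasks (package names are inferred from the task titles, not taken from the role's source):

```yaml
# Hypothetical sketch; these tasks would only run when the caller passes
# a non-empty network_state, which this play does not.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
```

Because the two tasks share the guard, the log shows the identical `false_condition: "network_state != {}"` in both skip results.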
18699 1726882362.36539: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18699 1726882362.36658: in run() - task 12673a56-9f93-1ce6-d207-000000000067 18699 1726882362.36685: variable 'ansible_search_path' from source: unknown 18699 1726882362.36695: variable 'ansible_search_path' from source: unknown 18699 1726882362.36737: calling self._execute() 18699 1726882362.36842: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882362.36853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882362.36868: variable 'omit' from source: magic vars 18699 1726882362.37259: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.37276: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882362.37408: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882362.37607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882362.40120: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882362.40198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882362.40241: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882362.40286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882362.40318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882362.40403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 18699 1726882362.40442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.40476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.40526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.40543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.40699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.40704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.40706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.40708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.40711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.40772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.40851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.40945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.41067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.41086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.41263: variable 'network_connections' from source: play vars 18699 1726882362.41280: variable 'profile' from source: play vars 18699 1726882362.41368: variable 'profile' from source: play vars 18699 1726882362.41458: variable 'interface' from source: set_fact 18699 1726882362.41461: variable 'interface' from source: set_fact 18699 1726882362.41531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882362.41726: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882362.41764: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882362.41807: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882362.41839: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882362.41883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882362.41919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882362.41950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.41980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882362.42040: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882362.42289: variable 'network_connections' from source: play vars 18699 1726882362.42302: variable 'profile' from source: play vars 18699 1726882362.42498: variable 'profile' from source: play vars 18699 1726882362.42501: variable 'interface' from source: set_fact 18699 1726882362.42503: variable 'interface' from source: set_fact 18699 1726882362.42505: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18699 1726882362.42507: when evaluation is False, skipping this task 18699 1726882362.42509: _execute() done 18699 1726882362.42511: dumping result to json 18699 1726882362.42513: done dumping result, returning 18699 1726882362.42515: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [12673a56-9f93-1ce6-d207-000000000067]
18699 1726882362.42525: sending task result for task 12673a56-9f93-1ce6-d207-000000000067
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
18699 1726882362.42684: no more pending results, returning what we have
18699 1726882362.42687: results queue empty
18699 1726882362.42689: checking for any_errors_fatal
18699 1726882362.42697: done checking for any_errors_fatal
18699 1726882362.42698: checking for max_fail_percentage
18699 1726882362.42700: done checking for max_fail_percentage
18699 1726882362.42701: checking to see if all hosts have failed and the running result is not ok
18699 1726882362.42702: done checking to see if all hosts have failed
18699 1726882362.42703: getting the remaining hosts for this loop
18699 1726882362.42704: done getting the remaining hosts for this loop
18699 1726882362.42709: getting the next task for host managed_node1
18699 1726882362.42715: done getting next task for host managed_node1
18699 1726882362.42719: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
18699 1726882362.42721: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882362.42735: getting variables
18699 1726882362.42737: in VariableManager get_vars()
18699 1726882362.42891: Calling all_inventory to load vars for managed_node1
18699 1726882362.42895: Calling groups_inventory to load vars for managed_node1
18699 1726882362.42898: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882362.42910: Calling all_plugins_play to load vars for managed_node1
18699 1726882362.42913: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882362.42916: Calling groups_plugins_play to load vars for managed_node1
18699 1726882362.43478: done sending task result for task 12673a56-9f93-1ce6-d207-000000000067
18699 1726882362.43481: WORKER PROCESS EXITING
18699 1726882362.44731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882362.46332: done with get_vars()
18699 1726882362.46360: done getting variables
18699 1726882362.46422: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:32:42 -0400 (0:00:00.105) 0:00:36.060 ******
18699 1726882362.46458: entering _queue_task() for managed_node1/service
18699 1726882362.46835: worker is 1 (out of 1 available)
18699 1726882362.46847: exiting _queue_task() for managed_node1/service
18699 1726882362.46859: done queuing things up, now waiting for results queue to drain
18699 1726882362.46860: waiting for pending results...
18699 1726882362.47155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18699 1726882362.47277: in run() - task 12673a56-9f93-1ce6-d207-000000000068 18699 1726882362.47301: variable 'ansible_search_path' from source: unknown 18699 1726882362.47312: variable 'ansible_search_path' from source: unknown 18699 1726882362.47431: calling self._execute() 18699 1726882362.47467: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882362.47477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882362.47492: variable 'omit' from source: magic vars 18699 1726882362.47887: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.47907: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882362.48080: variable 'network_provider' from source: set_fact 18699 1726882362.48098: variable 'network_state' from source: role '' defaults 18699 1726882362.48116: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18699 1726882362.48128: variable 'omit' from source: magic vars 18699 1726882362.48195: variable 'omit' from source: magic vars 18699 1726882362.48219: variable 'network_service_name' from source: role '' defaults 18699 1726882362.48406: variable 'network_service_name' from source: role '' defaults 18699 1726882362.48411: variable '__network_provider_setup' from source: role '' defaults 18699 1726882362.48418: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882362.48485: variable '__network_service_name_default_nm' from source: role '' defaults 18699 1726882362.48503: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882362.48576: variable '__network_packages_default_nm' from source: role '' defaults 18699 1726882362.48814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 18699 1726882362.50979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882362.51063: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882362.51115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882362.51174: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882362.51244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882362.51300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.51340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.51380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.51427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.51447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.51698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18699 1726882362.51702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.51704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.51706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.51709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.51875: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18699 1726882362.52003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.52033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.52068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.52114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.52132: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.52234: variable 'ansible_python' from source: facts 18699 1726882362.52267: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18699 1726882362.52347: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882362.52435: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882362.52565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.52604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.52632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.52671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.52696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.52748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882362.52898: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882362.52901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.52904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882362.52907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882362.53046: variable 'network_connections' from source: play vars 18699 1726882362.53057: variable 'profile' from source: play vars 18699 1726882362.53138: variable 'profile' from source: play vars 18699 1726882362.53148: variable 'interface' from source: set_fact 18699 1726882362.53210: variable 'interface' from source: set_fact 18699 1726882362.53336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882362.53533: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882362.53601: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882362.53647: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882362.53741: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882362.53841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882362.53901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882362.53939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882362.53974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882362.54124: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882362.54412: variable 'network_connections' from source: play vars 18699 1726882362.54424: variable 'profile' from source: play vars 18699 1726882362.54507: variable 'profile' from source: play vars 18699 1726882362.54518: variable 'interface' from source: set_fact 18699 1726882362.54601: variable 'interface' from source: set_fact 18699 1726882362.54643: variable '__network_packages_default_wireless' from source: role '' defaults 18699 1726882362.54748: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882362.55118: variable 'network_connections' from source: play vars 18699 1726882362.55128: variable 'profile' from source: play vars 18699 1726882362.55199: variable 'profile' from source: play vars 18699 1726882362.55302: variable 'interface' from source: set_fact 18699 1726882362.55305: variable 'interface' from source: set_fact 18699 1726882362.55339: variable '__network_packages_default_team' from source: role '' defaults 18699 1726882362.55432: variable '__network_team_connections_defined' from source: role '' defaults 18699 1726882362.55784: variable 
'network_connections' from source: play vars 18699 1726882362.55881: variable 'profile' from source: play vars 18699 1726882362.55884: variable 'profile' from source: play vars 18699 1726882362.55886: variable 'interface' from source: set_fact 18699 1726882362.55974: variable 'interface' from source: set_fact 18699 1726882362.56041: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882362.56113: variable '__network_service_name_default_initscripts' from source: role '' defaults 18699 1726882362.56135: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882362.56213: variable '__network_packages_default_initscripts' from source: role '' defaults 18699 1726882362.56451: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18699 1726882362.57049: variable 'network_connections' from source: play vars 18699 1726882362.57063: variable 'profile' from source: play vars 18699 1726882362.57206: variable 'profile' from source: play vars 18699 1726882362.57209: variable 'interface' from source: set_fact 18699 1726882362.57256: variable 'interface' from source: set_fact 18699 1726882362.57269: variable 'ansible_distribution' from source: facts 18699 1726882362.57278: variable '__network_rh_distros' from source: role '' defaults 18699 1726882362.57291: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.57328: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18699 1726882362.57605: variable 'ansible_distribution' from source: facts 18699 1726882362.57609: variable '__network_rh_distros' from source: role '' defaults 18699 1726882362.57612: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.57614: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18699 1726882362.57799: variable 'ansible_distribution' from source: 
facts 18699 1726882362.57856: variable '__network_rh_distros' from source: role '' defaults 18699 1726882362.58071: variable 'ansible_distribution_major_version' from source: facts 18699 1726882362.58075: variable 'network_provider' from source: set_fact 18699 1726882362.58077: variable 'omit' from source: magic vars 18699 1726882362.58085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882362.58129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882362.58191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882362.58236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882362.58278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882362.58398: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882362.58401: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882362.58405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882362.58455: Set connection var ansible_connection to ssh 18699 1726882362.58468: Set connection var ansible_pipelining to False 18699 1726882362.58521: Set connection var ansible_shell_executable to /bin/sh 18699 1726882362.58524: Set connection var ansible_timeout to 10 18699 1726882362.58527: Set connection var ansible_shell_type to sh 18699 1726882362.58529: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882362.58541: variable 'ansible_shell_executable' from source: unknown 18699 1726882362.58549: variable 'ansible_connection' from source: unknown 18699 1726882362.58555: variable 'ansible_module_compression' from source: unknown 18699 1726882362.58562: 
variable 'ansible_shell_type' from source: unknown 18699 1726882362.58567: variable 'ansible_shell_executable' from source: unknown 18699 1726882362.58574: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882362.58587: variable 'ansible_pipelining' from source: unknown 18699 1726882362.58630: variable 'ansible_timeout' from source: unknown 18699 1726882362.58633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882362.58717: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882362.58737: variable 'omit' from source: magic vars 18699 1726882362.58747: starting attempt loop 18699 1726882362.58759: running the handler 18699 1726882362.58848: variable 'ansible_facts' from source: unknown 18699 1726882362.59899: _low_level_execute_command(): starting 18699 1726882362.59903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882362.60772: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882362.60776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882362.60780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 18699 1726882362.60783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882362.60998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882362.61001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882362.61165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882362.61169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882362.62701: stdout chunk (state=3): >>>/root <<< 18699 1726882362.63062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882362.63068: stdout chunk (state=3): >>><<< 18699 1726882362.63076: stderr chunk (state=3): >>><<< 18699 1726882362.63098: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882362.63142: _low_level_execute_command(): starting 18699 1726882362.63145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237 `" && echo ansible-tmp-1726882362.6312563-20440-14328720648237="` echo /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237 `" ) && sleep 0' 18699 1726882362.64250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882362.64269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882362.64399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882362.64708: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18699 1726882362.64781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882362.66653: stdout chunk (state=3): >>>ansible-tmp-1726882362.6312563-20440-14328720648237=/root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237 <<< 18699 1726882362.66785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882362.66789: stdout chunk (state=3): >>><<< 18699 1726882362.66799: stderr chunk (state=3): >>><<< 18699 1726882362.66822: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882362.6312563-20440-14328720648237=/root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882362.66854: variable 'ansible_module_compression' from source: unknown 
18699 1726882362.66902: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18699 1726882362.67201: variable 'ansible_facts' from source: unknown 18699 1726882362.67475: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py 18699 1726882362.68071: Sending initial data 18699 1726882362.68074: Sent initial data (155 bytes) 18699 1726882362.68977: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882362.69190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882362.69208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882362.69222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882362.69291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882362.70797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882362.70847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882362.71090: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpv2n5iklb /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py <<< 18699 1726882362.71103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py" <<< 18699 1726882362.71107: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpv2n5iklb" to remote "/root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py" <<< 18699 1726882362.73831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882362.73835: stdout chunk (state=3): >>><<< 18699 1726882362.73849: stderr chunk (state=3): >>><<< 18699 1726882362.73934: done transferring module to remote 18699 1726882362.73948: _low_level_execute_command(): starting 18699 1726882362.74006: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/ /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py && sleep 0' 18699 1726882362.75268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882362.75453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882362.75613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882362.75708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882362.77409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882362.77617: stderr chunk (state=3): >>><<< 18699 1726882362.77622: stdout chunk (state=3): >>><<< 18699 1726882362.77712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882362.77715: _low_level_execute_command(): starting 18699 1726882362.77718: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/AnsiballZ_systemd.py && sleep 0' 18699 1726882362.78747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882362.78833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882362.78961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882362.78977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882362.79043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882363.08038: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10756096", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316310016", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1299719000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18699 1726882363.09598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882363.09603: stdout chunk (state=3): >>><<< 18699 1726882363.09605: stderr chunk (state=3): >>><<< 18699 1726882363.09625: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10756096", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3316310016", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "1299719000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882363.09915: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882363.09933: _low_level_execute_command(): starting 18699 1726882363.09938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882362.6312563-20440-14328720648237/ > /dev/null 2>&1 && sleep 0' 18699 1726882363.11115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882363.11404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882363.11417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882363.11429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882363.11495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882363.13590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882363.13700: stdout chunk (state=3): >>><<< 18699 1726882363.13703: stderr chunk (state=3): >>><<< 18699 1726882363.13706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882363.13710: handler run complete 18699 1726882363.13712: 
attempt loop complete, returning result 18699 1726882363.13714: _execute() done 18699 1726882363.13716: dumping result to json 18699 1726882363.13718: done dumping result, returning 18699 1726882363.13720: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-1ce6-d207-000000000068] 18699 1726882363.13722: sending task result for task 12673a56-9f93-1ce6-d207-000000000068 18699 1726882363.14384: done sending task result for task 12673a56-9f93-1ce6-d207-000000000068 18699 1726882363.14388: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882363.14445: no more pending results, returning what we have 18699 1726882363.14449: results queue empty 18699 1726882363.14450: checking for any_errors_fatal 18699 1726882363.14457: done checking for any_errors_fatal 18699 1726882363.14457: checking for max_fail_percentage 18699 1726882363.14459: done checking for max_fail_percentage 18699 1726882363.14460: checking to see if all hosts have failed and the running result is not ok 18699 1726882363.14461: done checking to see if all hosts have failed 18699 1726882363.14462: getting the remaining hosts for this loop 18699 1726882363.14463: done getting the remaining hosts for this loop 18699 1726882363.14467: getting the next task for host managed_node1 18699 1726882363.14473: done getting next task for host managed_node1 18699 1726882363.14478: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18699 1726882363.14480: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882363.14490: getting variables 18699 1726882363.14516: in VariableManager get_vars() 18699 1726882363.14585: Calling all_inventory to load vars for managed_node1 18699 1726882363.14588: Calling groups_inventory to load vars for managed_node1 18699 1726882363.14591: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882363.14857: Calling all_plugins_play to load vars for managed_node1 18699 1726882363.14860: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882363.14864: Calling groups_plugins_play to load vars for managed_node1 18699 1726882363.18523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882363.21843: done with get_vars() 18699 1726882363.21867: done getting variables 18699 1726882363.22132: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:32:43 -0400 (0:00:00.757) 0:00:36.817 ****** 18699 1726882363.22163: entering _queue_task() for managed_node1/service 18699 1726882363.23127: worker is 1 (out of 1 available) 18699 1726882363.23137: exiting _queue_task() for managed_node1/service 18699 1726882363.23147: done queuing things up, now waiting for results queue to drain 18699 1726882363.23148: waiting for pending results... 
18699 1726882363.23813: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18699 1726882363.23822: in run() - task 12673a56-9f93-1ce6-d207-000000000069 18699 1726882363.23826: variable 'ansible_search_path' from source: unknown 18699 1726882363.23828: variable 'ansible_search_path' from source: unknown 18699 1726882363.23831: calling self._execute() 18699 1726882363.23833: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882363.23905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882363.23921: variable 'omit' from source: magic vars 18699 1726882363.24596: variable 'ansible_distribution_major_version' from source: facts 18699 1726882363.24712: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882363.25050: variable 'network_provider' from source: set_fact 18699 1726882363.25053: Evaluated conditional (network_provider == "nm"): True 18699 1726882363.25120: variable '__network_wpa_supplicant_required' from source: role '' defaults 18699 1726882363.25329: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18699 1726882363.25817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882363.29908: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882363.29985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882363.30089: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882363.30198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882363.30230: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882363.30360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882363.30709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882363.30712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882363.30714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882363.30725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882363.30775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882363.30841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882363.30871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882363.30965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882363.31050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882363.31096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882363.31167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882363.31199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882363.31290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882363.31377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882363.31633: variable 'network_connections' from source: play vars 18699 1726882363.31705: variable 'profile' from source: play vars 18699 1726882363.31906: variable 'profile' from source: play vars 18699 1726882363.31909: variable 'interface' from source: set_fact 18699 1726882363.31968: variable 'interface' from source: set_fact 18699 1726882363.32085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18699 1726882363.32506: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18699 1726882363.32546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18699 1726882363.32584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18699 1726882363.32699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18699 1726882363.32746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18699 1726882363.32802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18699 1726882363.32913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882363.32943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18699 1726882363.33211: variable '__network_wireless_connections_defined' from source: role '' defaults 18699 1726882363.33488: variable 'network_connections' from source: play vars 18699 1726882363.33753: variable 'profile' from source: play vars 18699 1726882363.33756: variable 'profile' from source: play vars 18699 1726882363.33758: variable 'interface' from source: set_fact 18699 1726882363.33877: variable 'interface' from source: set_fact 18699 1726882363.33910: Evaluated conditional (__network_wpa_supplicant_required): False 18699 1726882363.33919: when evaluation is False, skipping this task 18699 1726882363.33927: _execute() done 18699 1726882363.33943: dumping result 
to json 18699 1726882363.33952: done dumping result, returning 18699 1726882363.33966: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-1ce6-d207-000000000069] 18699 1726882363.33976: sending task result for task 12673a56-9f93-1ce6-d207-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18699 1726882363.34124: no more pending results, returning what we have 18699 1726882363.34127: results queue empty 18699 1726882363.34128: checking for any_errors_fatal 18699 1726882363.34149: done checking for any_errors_fatal 18699 1726882363.34150: checking for max_fail_percentage 18699 1726882363.34152: done checking for max_fail_percentage 18699 1726882363.34153: checking to see if all hosts have failed and the running result is not ok 18699 1726882363.34154: done checking to see if all hosts have failed 18699 1726882363.34154: getting the remaining hosts for this loop 18699 1726882363.34156: done getting the remaining hosts for this loop 18699 1726882363.34159: getting the next task for host managed_node1 18699 1726882363.34165: done getting next task for host managed_node1 18699 1726882363.34169: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18699 1726882363.34171: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882363.34184: getting variables 18699 1726882363.34186: in VariableManager get_vars() 18699 1726882363.34228: Calling all_inventory to load vars for managed_node1 18699 1726882363.34231: Calling groups_inventory to load vars for managed_node1 18699 1726882363.34234: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882363.34245: Calling all_plugins_play to load vars for managed_node1 18699 1726882363.34248: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882363.34251: Calling groups_plugins_play to load vars for managed_node1 18699 1726882363.35200: done sending task result for task 12673a56-9f93-1ce6-d207-000000000069 18699 1726882363.35203: WORKER PROCESS EXITING 18699 1726882363.37405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882363.40546: done with get_vars() 18699 1726882363.40576: done getting variables 18699 1726882363.40638: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:32:43 -0400 (0:00:00.185) 0:00:37.002 ****** 18699 1726882363.40668: entering _queue_task() for managed_node1/service 18699 1726882363.41429: worker is 1 (out of 1 available) 18699 1726882363.41439: exiting _queue_task() for managed_node1/service 18699 1726882363.41451: done queuing things up, now waiting for results queue to drain 18699 1726882363.41452: waiting for pending results... 
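The skip recorded above ("false_condition": "__network_wpa_supplicant_required") is what a `when:`-guarded task produces when its conditional evaluates False. A minimal sketch of that shape, assuming a plain service task — the real task at roles/network/tasks/main.yml:133 is not shown in this log, so module arguments here are illustrative only:

```yaml
# Hypothetical sketch of a when-guarded task that would skip exactly as logged;
# the actual fedora.linux_system_roles.network task content is not in this log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required | bool
```

When the guard is False, Ansible emits the `skipping: [host] => {..., "false_condition": ..., "skip_reason": "Conditional result was False"}` result seen above instead of running the module.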
18699 1726882363.42359: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 18699 1726882363.42518: in run() - task 12673a56-9f93-1ce6-d207-00000000006a 18699 1726882363.43001: variable 'ansible_search_path' from source: unknown 18699 1726882363.43005: variable 'ansible_search_path' from source: unknown 18699 1726882363.43009: calling self._execute() 18699 1726882363.43161: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882363.43172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882363.43188: variable 'omit' from source: magic vars 18699 1726882363.44379: variable 'ansible_distribution_major_version' from source: facts 18699 1726882363.44433: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882363.44873: variable 'network_provider' from source: set_fact 18699 1726882363.44885: Evaluated conditional (network_provider == "initscripts"): False 18699 1726882363.45072: when evaluation is False, skipping this task 18699 1726882363.45075: _execute() done 18699 1726882363.45078: dumping result to json 18699 1726882363.45080: done dumping result, returning 18699 1726882363.45083: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-1ce6-d207-00000000006a] 18699 1726882363.45086: sending task result for task 12673a56-9f93-1ce6-d207-00000000006a 18699 1726882363.45160: done sending task result for task 12673a56-9f93-1ce6-d207-00000000006a 18699 1726882363.45164: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18699 1726882363.45215: no more pending results, returning what we have 18699 1726882363.45220: results queue empty 18699 1726882363.45221: checking for any_errors_fatal 18699 1726882363.45232: done checking for 
any_errors_fatal 18699 1726882363.45233: checking for max_fail_percentage 18699 1726882363.45235: done checking for max_fail_percentage 18699 1726882363.45236: checking to see if all hosts have failed and the running result is not ok 18699 1726882363.45237: done checking to see if all hosts have failed 18699 1726882363.45237: getting the remaining hosts for this loop 18699 1726882363.45239: done getting the remaining hosts for this loop 18699 1726882363.45243: getting the next task for host managed_node1 18699 1726882363.45249: done getting next task for host managed_node1 18699 1726882363.45254: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18699 1726882363.45257: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882363.45272: getting variables 18699 1726882363.45274: in VariableManager get_vars() 18699 1726882363.45318: Calling all_inventory to load vars for managed_node1 18699 1726882363.45322: Calling groups_inventory to load vars for managed_node1 18699 1726882363.45325: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882363.45338: Calling all_plugins_play to load vars for managed_node1 18699 1726882363.45341: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882363.45345: Calling groups_plugins_play to load vars for managed_node1 18699 1726882363.50636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882363.53250: done with get_vars() 18699 1726882363.53334: done getting variables 18699 1726882363.53546: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:32:43 -0400 (0:00:00.129) 0:00:37.131 ****** 18699 1726882363.53579: entering _queue_task() for managed_node1/copy 18699 1726882363.54253: worker is 1 (out of 1 available) 18699 1726882363.54264: exiting _queue_task() for managed_node1/copy 18699 1726882363.54274: done queuing things up, now waiting for results queue to drain 18699 1726882363.54275: waiting for pending results... 
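Note the difference in the skip output for "Enable network service" above: instead of a `false_condition` field, the result is replaced by the "output has been hidden" placeholder. That is the effect of `no_log: true`, which censors even skip results. A hedged sketch of a task shaped to produce that output (the role's actual task body at main.yml:142 is not printed in this log):

```yaml
# Illustrative only: with no_log set, Ansible replaces the result payload with
# the "censored" placeholder seen in the log, even when the task merely skips.
- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"
  no_log: true
```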
18699 1726882363.54591: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18699 1726882363.54691: in run() - task 12673a56-9f93-1ce6-d207-00000000006b 18699 1726882363.54700: variable 'ansible_search_path' from source: unknown 18699 1726882363.54703: variable 'ansible_search_path' from source: unknown 18699 1726882363.54706: calling self._execute() 18699 1726882363.54799: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882363.54817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882363.54835: variable 'omit' from source: magic vars 18699 1726882363.55499: variable 'ansible_distribution_major_version' from source: facts 18699 1726882363.55502: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882363.55610: variable 'network_provider' from source: set_fact 18699 1726882363.55622: Evaluated conditional (network_provider == "initscripts"): False 18699 1726882363.55999: when evaluation is False, skipping this task 18699 1726882363.56003: _execute() done 18699 1726882363.56006: dumping result to json 18699 1726882363.56008: done dumping result, returning 18699 1726882363.56011: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-1ce6-d207-00000000006b] 18699 1726882363.56013: sending task result for task 12673a56-9f93-1ce6-d207-00000000006b 18699 1726882363.56084: done sending task result for task 12673a56-9f93-1ce6-d207-00000000006b 18699 1726882363.56087: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18699 1726882363.56138: no more pending results, returning what we have 18699 1726882363.56142: results queue empty 18699 1726882363.56143: checking for 
any_errors_fatal 18699 1726882363.56147: done checking for any_errors_fatal 18699 1726882363.56148: checking for max_fail_percentage 18699 1726882363.56151: done checking for max_fail_percentage 18699 1726882363.56152: checking to see if all hosts have failed and the running result is not ok 18699 1726882363.56152: done checking to see if all hosts have failed 18699 1726882363.56153: getting the remaining hosts for this loop 18699 1726882363.56154: done getting the remaining hosts for this loop 18699 1726882363.56158: getting the next task for host managed_node1 18699 1726882363.56163: done getting next task for host managed_node1 18699 1726882363.56167: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18699 1726882363.56169: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882363.56183: getting variables 18699 1726882363.56185: in VariableManager get_vars() 18699 1726882363.56226: Calling all_inventory to load vars for managed_node1 18699 1726882363.56229: Calling groups_inventory to load vars for managed_node1 18699 1726882363.56232: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882363.56243: Calling all_plugins_play to load vars for managed_node1 18699 1726882363.56246: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882363.56250: Calling groups_plugins_play to load vars for managed_node1 18699 1726882363.60059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882363.62368: done with get_vars() 18699 1726882363.62402: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:32:43 -0400 (0:00:00.089) 0:00:37.221 ****** 18699 1726882363.62499: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18699 1726882363.62923: worker is 1 (out of 1 available) 18699 1726882363.62936: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 18699 1726882363.62948: done queuing things up, now waiting for results queue to drain 18699 1726882363.62948: waiting for pending results... 
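The variable resolution above pulls `network_connections` from play vars and interpolates `profile` (play vars) and `interface` (set_fact) into it. The log never prints the actual values, so the following is only an assumed minimal shape for such a play var, following the connection-profile style the role's tasks suggest:

```yaml
# Assumed shape of the play vars referenced in the log; field names and values
# are illustrative, not taken from this run.
network_connections:
  - name: "{{ profile }}"
    interface_name: "{{ interface }}"
    type: ethernet
    state: up
```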
18699 1726882363.63609: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18699 1726882363.63690: in run() - task 12673a56-9f93-1ce6-d207-00000000006c 18699 1726882363.63955: variable 'ansible_search_path' from source: unknown 18699 1726882363.63958: variable 'ansible_search_path' from source: unknown 18699 1726882363.63961: calling self._execute() 18699 1726882363.64050: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882363.64398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882363.64403: variable 'omit' from source: magic vars 18699 1726882363.65401: variable 'ansible_distribution_major_version' from source: facts 18699 1726882363.65478: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882363.65491: variable 'omit' from source: magic vars 18699 1726882363.65880: variable 'omit' from source: magic vars 18699 1726882363.66225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18699 1726882363.73140: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18699 1726882363.73318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18699 1726882363.73406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18699 1726882363.73501: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18699 1726882363.73530: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18699 1726882363.73767: variable 'network_provider' from source: set_fact 18699 1726882363.74201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18699 1726882363.74205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18699 1726882363.74207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18699 1726882363.74361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18699 1726882363.74378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18699 1726882363.74516: variable 'omit' from source: magic vars 18699 1726882363.74635: variable 'omit' from source: magic vars 18699 1726882363.75105: variable 'network_connections' from source: play vars 18699 1726882363.75117: variable 'profile' from source: play vars 18699 1726882363.75189: variable 'profile' from source: play vars 18699 1726882363.75196: variable 'interface' from source: set_fact 18699 1726882363.75373: variable 'interface' from source: set_fact 18699 1726882363.75635: variable 'omit' from source: magic vars 18699 1726882363.75645: variable '__lsr_ansible_managed' from source: task vars 18699 1726882363.75825: variable '__lsr_ansible_managed' from source: task vars 18699 1726882363.76722: Loaded config def from plugin (lookup/template) 18699 1726882363.76725: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18699 1726882363.76766: File lookup term: get_ansible_managed.j2 18699 
1726882363.76890: variable 'ansible_search_path' from source: unknown 18699 1726882363.76899: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18699 1726882363.76921: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18699 1726882363.76933: variable 'ansible_search_path' from source: unknown 18699 1726882363.96429: variable 'ansible_managed' from source: unknown 18699 1726882363.96642: variable 'omit' from source: magic vars 18699 1726882363.96873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882363.96877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882363.96880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882363.96882: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882363.96884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882363.96886: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882363.96889: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882363.96891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882363.96960: Set connection var ansible_connection to ssh 18699 1726882363.96966: Set connection var ansible_pipelining to False 18699 1726882363.96976: Set connection var ansible_shell_executable to /bin/sh 18699 1726882363.96980: Set connection var ansible_timeout to 10 18699 1726882363.96982: Set connection var ansible_shell_type to sh 18699 1726882363.96984: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882363.97021: variable 'ansible_shell_executable' from source: unknown 18699 1726882363.97024: variable 'ansible_connection' from source: unknown 18699 1726882363.97026: variable 'ansible_module_compression' from source: unknown 18699 1726882363.97088: variable 'ansible_shell_type' from source: unknown 18699 1726882363.97095: variable 'ansible_shell_executable' from source: unknown 18699 1726882363.97098: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882363.97100: variable 'ansible_pipelining' from source: unknown 18699 1726882363.97103: variable 'ansible_timeout' from source: unknown 18699 1726882363.97105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882363.97421: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882363.97431: variable 'omit' from source: magic vars 18699 1726882363.97434: starting attempt loop 18699 1726882363.97436: running the handler 18699 1726882363.97438: _low_level_execute_command(): starting 18699 1726882363.97440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882363.99283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882363.99287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882363.99303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882363.99305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882363.99325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882363.99499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882364.01091: stdout chunk (state=3): >>>/root <<< 18699 1726882364.01235: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882364.01238: stdout chunk (state=3): >>><<< 18699 1726882364.01248: stderr chunk (state=3): >>><<< 18699 1726882364.01316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882364.01330: _low_level_execute_command(): starting 18699 1726882364.01336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327 `" && echo ansible-tmp-1726882364.0131667-20485-241750977426327="` echo /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327 `" ) && sleep 0' 18699 1726882364.02327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882364.02379: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882364.02390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882364.02419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882364.02433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882364.02439: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882364.02449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.02467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882364.02474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882364.02483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882364.02497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882364.02542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.02643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882364.02653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882364.02757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882364.04642: stdout chunk (state=3): >>>ansible-tmp-1726882364.0131667-20485-241750977426327=/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327 
<<< 18699 1726882364.04900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882364.04906: stdout chunk (state=3): >>><<< 18699 1726882364.04908: stderr chunk (state=3): >>><<< 18699 1726882364.04911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882364.0131667-20485-241750977426327=/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882364.04922: variable 'ansible_module_compression' from source: unknown 18699 1726882364.04963: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18699 1726882364.05038: variable 'ansible_facts' from source: unknown 18699 1726882364.05187: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py 18699 1726882364.05345: Sending initial data 18699 1726882364.05349: Sent initial data (168 bytes) 18699 1726882364.06127: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882364.06219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.06233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882364.06282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882364.06364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882364.07852: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18699 1726882364.07859: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18699 1726882364.07874: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882364.07943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18699 1726882364.08008: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp3w4y86ck /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py <<< 18699 1726882364.08011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py" <<< 18699 1726882364.08058: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp3w4y86ck" to remote "/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py" <<< 18699 1726882364.09069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882364.09127: stderr chunk (state=3): >>><<< 18699 1726882364.09136: stdout chunk (state=3): >>><<< 18699 1726882364.09167: done transferring module to remote 18699 1726882364.09201: _low_level_execute_command(): starting 18699 1726882364.09273: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/ 
/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py && sleep 0' 18699 1726882364.09826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882364.09917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.09922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882364.09924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882364.09926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.09951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882364.10076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882364.10383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882364.11909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882364.11936: stderr chunk (state=3): >>><<< 18699 1726882364.11946: stdout chunk (state=3): >>><<< 18699 1726882364.11966: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882364.11974: _low_level_execute_command(): starting 18699 1726882364.11982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/AnsiballZ_network_connections.py && sleep 0' 18699 1726882364.12522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882364.12536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882364.12551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882364.12568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882364.12585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 
1726882364.12600: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882364.12614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.12632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882364.12643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882364.12653: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882364.12711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.12756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882364.12772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882364.12797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882364.12873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882364.39327: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s60wmobi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s60wmobi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 
102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/a5a9140a-b936-48d0-9f96-c02df457936c: error=unknown <<< 18699 1726882364.39480: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18699 1726882364.41134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882364.41156: stderr chunk (state=3): >>><<< 18699 1726882364.41159: stdout chunk (state=3): >>><<< 18699 1726882364.41174: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s60wmobi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_s60wmobi/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/a5a9140a-b936-48d0-9f96-c02df457936c: error=unknown {"changed": true, "warnings": [], "stderr": "\n", 
"_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
18699 1726882364.41210: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882364.41216: _low_level_execute_command(): starting 18699 1726882364.41221: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882364.0131667-20485-241750977426327/ > /dev/null 2>&1 && sleep 0' 18699 1726882364.41669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882364.41672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.41675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 18699 1726882364.41677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882364.41728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882364.41732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882364.41735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882364.41778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882364.43562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882364.43589: stderr chunk (state=3): >>><<< 18699 1726882364.43592: stdout chunk (state=3): >>><<< 18699 1726882364.43616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882364.43625: handler run complete 18699 1726882364.43640: attempt loop complete, returning result 18699 1726882364.43658: _execute() done 18699 1726882364.43661: dumping result to json 18699 1726882364.43667: done dumping result, returning 18699 1726882364.43671: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-1ce6-d207-00000000006c] 18699 1726882364.43706: sending task result for task 12673a56-9f93-1ce6-d207-00000000006c 18699 1726882364.43808: done sending task result for task 12673a56-9f93-1ce6-d207-00000000006c 18699 1726882364.43810: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18699 1726882364.44004: no more pending results, returning what we have 18699 1726882364.44008: results queue empty 18699 1726882364.44009: checking for any_errors_fatal 18699 1726882364.44073: done checking for any_errors_fatal 18699 1726882364.44075: checking for max_fail_percentage 18699 1726882364.44077: done checking for max_fail_percentage 18699 1726882364.44078: checking to see if all hosts have failed and the running result is not ok 18699 1726882364.44079: done checking to see if all hosts have failed 18699 1726882364.44079: getting the remaining hosts for this loop 18699 1726882364.44081: done getting the remaining hosts for this loop 18699 1726882364.44084: getting the next task for host managed_node1 18699 1726882364.44091: done getting next task 
for host managed_node1 18699 1726882364.44097: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18699 1726882364.44099: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882364.44109: getting variables 18699 1726882364.44110: in VariableManager get_vars() 18699 1726882364.44207: Calling all_inventory to load vars for managed_node1 18699 1726882364.44210: Calling groups_inventory to load vars for managed_node1 18699 1726882364.44212: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882364.44222: Calling all_plugins_play to load vars for managed_node1 18699 1726882364.44224: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882364.44229: Calling groups_plugins_play to load vars for managed_node1 18699 1726882364.45637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882364.47175: done with get_vars() 18699 1726882364.47211: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:32:44 -0400 (0:00:00.848) 0:00:38.069 ****** 18699 1726882364.47340: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18699 1726882364.47691: worker is 1 (out of 1 available) 18699 1726882364.47707: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 18699 1726882364.47718: done queuing things up, now waiting for results queue to drain 18699 1726882364.47719: waiting for pending results... 
18699 1726882364.47888: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 18699 1726882364.47964: in run() - task 12673a56-9f93-1ce6-d207-00000000006d 18699 1726882364.47977: variable 'ansible_search_path' from source: unknown 18699 1726882364.47980: variable 'ansible_search_path' from source: unknown 18699 1726882364.48011: calling self._execute() 18699 1726882364.48105: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882364.48108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882364.48154: variable 'omit' from source: magic vars 18699 1726882364.48479: variable 'ansible_distribution_major_version' from source: facts 18699 1726882364.48599: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882364.48602: variable 'network_state' from source: role '' defaults 18699 1726882364.48799: Evaluated conditional (network_state != {}): False 18699 1726882364.48802: when evaluation is False, skipping this task 18699 1726882364.48804: _execute() done 18699 1726882364.48806: dumping result to json 18699 1726882364.48807: done dumping result, returning 18699 1726882364.48809: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-1ce6-d207-00000000006d] 18699 1726882364.48811: sending task result for task 12673a56-9f93-1ce6-d207-00000000006d 18699 1726882364.48868: done sending task result for task 12673a56-9f93-1ce6-d207-00000000006d 18699 1726882364.48871: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18699 1726882364.48914: no more pending results, returning what we have 18699 1726882364.48917: results queue empty 18699 1726882364.48918: checking for any_errors_fatal 18699 1726882364.48925: done checking for any_errors_fatal 
18699 1726882364.48926: checking for max_fail_percentage 18699 1726882364.48927: done checking for max_fail_percentage 18699 1726882364.48928: checking to see if all hosts have failed and the running result is not ok 18699 1726882364.48928: done checking to see if all hosts have failed 18699 1726882364.48929: getting the remaining hosts for this loop 18699 1726882364.48930: done getting the remaining hosts for this loop 18699 1726882364.48933: getting the next task for host managed_node1 18699 1726882364.48938: done getting next task for host managed_node1 18699 1726882364.48941: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18699 1726882364.48943: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882364.48955: getting variables 18699 1726882364.48956: in VariableManager get_vars() 18699 1726882364.48987: Calling all_inventory to load vars for managed_node1 18699 1726882364.48990: Calling groups_inventory to load vars for managed_node1 18699 1726882364.49002: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882364.49011: Calling all_plugins_play to load vars for managed_node1 18699 1726882364.49014: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882364.49017: Calling groups_plugins_play to load vars for managed_node1 18699 1726882364.50760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882364.52530: done with get_vars() 18699 1726882364.52567: done getting variables 18699 1726882364.52638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:32:44 -0400 (0:00:00.053) 0:00:38.122 ****** 18699 1726882364.52686: entering _queue_task() for managed_node1/debug 18699 1726882364.53343: worker is 1 (out of 1 available) 18699 1726882364.53355: exiting _queue_task() for managed_node1/debug 18699 1726882364.53374: done queuing things up, now waiting for results queue to drain 18699 1726882364.53375: waiting for pending results... 
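The `Configure networking state` task above illustrates Ansible's `when:` handling: the role default is `network_state: {}`, so the conditional `network_state != {}` evaluates to False and the task is skipped, with the failing expression reported back as `false_condition`. A toy sketch of that skip logic (real Ansible renders the expression with Jinja2; plain `eval` here is only for illustration and is unsafe for untrusted input):

```python
def evaluate_when(conditions, variables):
    """Toy sketch of Ansible's `when:` evaluation: check each condition
    against the task's variables and, on the first false one, build a
    skip result like the `skipping: [managed_node1]` entry in the log."""
    for cond in conditions:
        # Ansible uses Jinja2 templating here, not eval(); eval() is a
        # stand-in purely to keep this sketch self-contained.
        if not eval(cond, {}, dict(variables)):
            return {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return None  # all conditions true: task would run

# network_state defaults to {} in the role, so the state task is skipped:
skip = evaluate_when(["network_state != {}"], {"network_state": {}})
print(skip["false_condition"])  # network_state != {}
```

The dict printed here mirrors the skip result emitted for task `12673a56-9f93-1ce6-d207-00000000006d` above.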
18699 1726882364.53812: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
18699 1726882364.53818: in run() - task 12673a56-9f93-1ce6-d207-00000000006e
18699 1726882364.53820: variable 'ansible_search_path' from source: unknown
18699 1726882364.53826: variable 'ansible_search_path' from source: unknown
18699 1726882364.53923: calling self._execute()
18699 1726882364.53983: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.54000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.54016: variable 'omit' from source: magic vars
18699 1726882364.54488: variable 'ansible_distribution_major_version' from source: facts
18699 1726882364.54516: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882364.54528: variable 'omit' from source: magic vars
18699 1726882364.54591: variable 'omit' from source: magic vars
18699 1726882364.54688: variable 'omit' from source: magic vars
18699 1726882364.54697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18699 1726882364.54733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18699 1726882364.54760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18699 1726882364.54780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882364.54817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882364.54856: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18699 1726882364.54906: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.54914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.54998: Set connection var ansible_connection to ssh
18699 1726882364.55025: Set connection var ansible_pipelining to False
18699 1726882364.55036: Set connection var ansible_shell_executable to /bin/sh
18699 1726882364.55045: Set connection var ansible_timeout to 10
18699 1726882364.55050: Set connection var ansible_shell_type to sh
18699 1726882364.55098: Set connection var ansible_module_compression to ZIP_DEFLATED
18699 1726882364.55101: variable 'ansible_shell_executable' from source: unknown
18699 1726882364.55103: variable 'ansible_connection' from source: unknown
18699 1726882364.55105: variable 'ansible_module_compression' from source: unknown
18699 1726882364.55107: variable 'ansible_shell_type' from source: unknown
18699 1726882364.55108: variable 'ansible_shell_executable' from source: unknown
18699 1726882364.55110: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.55131: variable 'ansible_pipelining' from source: unknown
18699 1726882364.55139: variable 'ansible_timeout' from source: unknown
18699 1726882364.55146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.55325: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18699 1726882364.55359: variable 'omit' from source: magic vars
18699 1726882364.55447: starting attempt loop
18699 1726882364.55450: running the handler
18699 1726882364.55532: variable '__network_connections_result' from source: set_fact
18699 1726882364.55605: handler run complete
18699 1726882364.55628: attempt loop complete, returning result
18699 1726882364.55664: _execute() done
18699 1726882364.55667: dumping result to json
18699 1726882364.55675: done dumping result, returning
18699 1726882364.55680: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-1ce6-d207-00000000006e]
18699 1726882364.55682: sending task result for task 12673a56-9f93-1ce6-d207-00000000006e
18699 1726882364.56014: done sending task result for task 12673a56-9f93-1ce6-d207-00000000006e
18699 1726882364.56018: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
18699 1726882364.56081: no more pending results, returning what we have
18699 1726882364.56084: results queue empty
18699 1726882364.56085: checking for any_errors_fatal
18699 1726882364.56090: done checking for any_errors_fatal
18699 1726882364.56091: checking for max_fail_percentage
18699 1726882364.56097: done checking for max_fail_percentage
18699 1726882364.56098: checking to see if all hosts have failed and the running result is not ok
18699 1726882364.56099: done checking to see if all hosts have failed
18699 1726882364.56100: getting the remaining hosts for this loop
18699 1726882364.56102: done getting the remaining hosts for this loop
18699 1726882364.56105: getting the next task for host managed_node1
18699 1726882364.56111: done getting next task for host managed_node1
18699 1726882364.56115: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
18699 1726882364.56117: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882364.56126: getting variables
18699 1726882364.56136: in VariableManager get_vars()
18699 1726882364.56175: Calling all_inventory to load vars for managed_node1
18699 1726882364.56178: Calling groups_inventory to load vars for managed_node1
18699 1726882364.56181: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882364.56191: Calling all_plugins_play to load vars for managed_node1
18699 1726882364.56203: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882364.56207: Calling groups_plugins_play to load vars for managed_node1
18699 1726882364.62741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882364.64461: done with get_vars()
18699 1726882364.64490: done getting variables
18699 1726882364.64545: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024  21:32:44 -0400 (0:00:00.118)       0:00:38.241 ******
18699 1726882364.64573: entering _queue_task() for managed_node1/debug
18699 1726882364.64934: worker is 1 (out of 1 available)
18699 1726882364.64947: exiting _queue_task() for managed_node1/debug
18699 1726882364.64959: done queuing things up, now waiting for results queue to drain
18699 1726882364.64960: waiting for pending results...
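The `ok:` result above renders `__network_connections_result.stderr_lines` as a list containing one empty string, because the captured stderr was a bare `"\n"`. The `*_lines` keys in Ansible results are derived by newline-splitting the corresponding string, which a minimal Python sketch reproduces:

```python
def to_lines(text: str) -> list[str]:
    # A stderr of "\n" splits into a single empty-string element,
    # matching the ["" ] shown in the task result above.
    return text.splitlines()

print(to_lines("\n"))      # ['']
print(to_lines("warn\n"))  # ['warn']
```

So an "empty" stderr that still ends in a newline shows up as `stderr_lines: [""]` rather than `[]`, which is exactly what this run prints.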
18699 1726882364.65312: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
18699 1726882364.65355: in run() - task 12673a56-9f93-1ce6-d207-00000000006f
18699 1726882364.65369: variable 'ansible_search_path' from source: unknown
18699 1726882364.65372: variable 'ansible_search_path' from source: unknown
18699 1726882364.65451: calling self._execute()
18699 1726882364.65515: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.65521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.65560: variable 'omit' from source: magic vars
18699 1726882364.66180: variable 'ansible_distribution_major_version' from source: facts
18699 1726882364.66184: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882364.66186: variable 'omit' from source: magic vars
18699 1726882364.66188: variable 'omit' from source: magic vars
18699 1726882364.66211: variable 'omit' from source: magic vars
18699 1726882364.66251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18699 1726882364.66349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18699 1726882364.66353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18699 1726882364.66355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882364.66358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882364.66406: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18699 1726882364.66410: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.66413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.66474: Set connection var ansible_connection to ssh
18699 1726882364.66482: Set connection var ansible_pipelining to False
18699 1726882364.66487: Set connection var ansible_shell_executable to /bin/sh
18699 1726882364.66497: Set connection var ansible_timeout to 10
18699 1726882364.66501: Set connection var ansible_shell_type to sh
18699 1726882364.66708: Set connection var ansible_module_compression to ZIP_DEFLATED
18699 1726882364.66734: variable 'ansible_shell_executable' from source: unknown
18699 1726882364.66738: variable 'ansible_connection' from source: unknown
18699 1726882364.66741: variable 'ansible_module_compression' from source: unknown
18699 1726882364.66743: variable 'ansible_shell_type' from source: unknown
18699 1726882364.66746: variable 'ansible_shell_executable' from source: unknown
18699 1726882364.66748: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.66750: variable 'ansible_pipelining' from source: unknown
18699 1726882364.66752: variable 'ansible_timeout' from source: unknown
18699 1726882364.66784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.66897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
18699 1726882364.67021: variable 'omit' from source: magic vars
18699 1726882364.67024: starting attempt loop
18699 1726882364.67027: running the handler
18699 1726882364.67144: variable '__network_connections_result' from source: set_fact
18699 1726882364.67254: variable '__network_connections_result' from source: set_fact
18699 1726882364.67465: handler run complete
18699 1726882364.67488: attempt loop complete, returning result
18699 1726882364.67495: _execute() done
18699 1726882364.67498: dumping result to json
18699 1726882364.67658: done dumping result, returning
18699 1726882364.67661: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-1ce6-d207-00000000006f]
18699 1726882364.67665: sending task result for task 12673a56-9f93-1ce6-d207-00000000006f
18699 1726882364.67743: done sending task result for task 12673a56-9f93-1ce6-d207-00000000006f
18699 1726882364.67746: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "lsr27",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
18699 1726882364.67856: no more pending results, returning what we have
18699 1726882364.67860: results queue empty
18699 1726882364.67861: checking for any_errors_fatal
18699 1726882364.67869: done checking for any_errors_fatal
18699 1726882364.67870: checking for max_fail_percentage
18699 1726882364.67872: done checking for max_fail_percentage
18699 1726882364.67873: checking to see if all hosts have failed and the running result is not ok
18699 1726882364.67873: done checking to see if all hosts have failed
18699 1726882364.67874: getting the remaining hosts for this loop
18699 1726882364.67875: done getting the remaining hosts for this loop
18699 1726882364.67880: getting the next task for host managed_node1
18699 1726882364.67886: done getting next task for host managed_node1
18699 1726882364.67890: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
18699 1726882364.67892: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882364.67905: getting variables
18699 1726882364.67907: in VariableManager get_vars()
18699 1726882364.67946: Calling all_inventory to load vars for managed_node1
18699 1726882364.67950: Calling groups_inventory to load vars for managed_node1
18699 1726882364.67953: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882364.67963: Calling all_plugins_play to load vars for managed_node1
18699 1726882364.67967: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882364.67970: Calling groups_plugins_play to load vars for managed_node1
18699 1726882364.69589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882364.72110: done with get_vars()
18699 1726882364.72133: done getting variables
18699 1726882364.72196: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024  21:32:44 -0400 (0:00:00.076)       0:00:38.318 ******
18699 1726882364.72234: entering _queue_task() for managed_node1/debug
18699 1726882364.72584: worker is 1 (out of 1 available)
18699 1726882364.72699: exiting _queue_task() for managed_node1/debug
18699 1726882364.72709: done queuing things up, now waiting for results queue to drain
18699 1726882364.72710: waiting for pending results...
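The network_state task that runs next is skipped because its `when:` clause evaluates false, and the skip result carries the clause back as `false_condition`. A simplified, hypothetical model of that short-circuit (the real executor templates the condition through Jinja2, not `eval()`):

```python
def run_task(when, facts):
    # Hypothetical sketch: if a when-clause evaluates false against the
    # available facts, the task body never runs and a skip result is
    # returned with the failing condition recorded as false_condition.
    if when is not None and not eval(when, {}, facts):
        return {"skipped": True, "false_condition": when}
    return {"changed": False}  # placeholder for actually running the task

result = run_task("network_state != {}", {"network_state": {}})
print(result)  # {'skipped': True, 'false_condition': 'network_state != {}'}
```

With `network_state` left at its empty role default, `network_state != {}` is false, so the log below reports `skipping: [managed_node1]` with exactly that condition.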
18699 1726882364.73218: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
18699 1726882364.73223: in run() - task 12673a56-9f93-1ce6-d207-000000000070
18699 1726882364.73226: variable 'ansible_search_path' from source: unknown
18699 1726882364.73229: variable 'ansible_search_path' from source: unknown
18699 1726882364.73231: calling self._execute()
18699 1726882364.73603: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.73606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.73608: variable 'omit' from source: magic vars
18699 1726882364.74461: variable 'ansible_distribution_major_version' from source: facts
18699 1726882364.74465: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882364.74561: variable 'network_state' from source: role '' defaults
18699 1726882364.74589: Evaluated conditional (network_state != {}): False
18699 1726882364.74631: when evaluation is False, skipping this task
18699 1726882364.74639: _execute() done
18699 1726882364.74646: dumping result to json
18699 1726882364.74659: done dumping result, returning
18699 1726882364.74671: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-1ce6-d207-000000000070]
18699 1726882364.74683: sending task result for task 12673a56-9f93-1ce6-d207-000000000070
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
18699 1726882364.74828: no more pending results, returning what we have
18699 1726882364.74832: results queue empty
18699 1726882364.74835: checking for any_errors_fatal
18699 1726882364.74843: done checking for any_errors_fatal
18699 1726882364.74844: checking for max_fail_percentage
18699 1726882364.74846: done checking for max_fail_percentage
18699 1726882364.74846: checking to see if all hosts have failed and the running result is not ok
18699 1726882364.74847: done checking to see if all hosts have failed
18699 1726882364.74848: getting the remaining hosts for this loop
18699 1726882364.74849: done getting the remaining hosts for this loop
18699 1726882364.74853: getting the next task for host managed_node1
18699 1726882364.74859: done getting next task for host managed_node1
18699 1726882364.74863: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
18699 1726882364.74865: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882364.74879: getting variables
18699 1726882364.74881: in VariableManager get_vars()
18699 1726882364.74921: Calling all_inventory to load vars for managed_node1
18699 1726882364.74924: Calling groups_inventory to load vars for managed_node1
18699 1726882364.74926: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882364.74938: Calling all_plugins_play to load vars for managed_node1
18699 1726882364.74941: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882364.74944: Calling groups_plugins_play to load vars for managed_node1
18699 1726882364.75706: done sending task result for task 12673a56-9f93-1ce6-d207-000000000070
18699 1726882364.75709: WORKER PROCESS EXITING
18699 1726882364.76605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882364.78297: done with get_vars()
18699 1726882364.78325: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024  21:32:44 -0400 (0:00:00.063)       0:00:38.381 ******
18699 1726882364.78544: entering _queue_task() for managed_node1/ping
18699 1726882364.79295: worker is 1 (out of 1 available)
18699 1726882364.79309: exiting _queue_task() for managed_node1/ping
18699 1726882364.79320: done queuing things up, now waiting for results queue to drain
18699 1726882364.79321: waiting for pending results...
18699 1726882364.79839: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
18699 1726882364.79965: in run() - task 12673a56-9f93-1ce6-d207-000000000071
18699 1726882364.79985: variable 'ansible_search_path' from source: unknown
18699 1726882364.79991: variable 'ansible_search_path' from source: unknown
18699 1726882364.80038: calling self._execute()
18699 1726882364.80142: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.80152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.80166: variable 'omit' from source: magic vars
18699 1726882364.80553: variable 'ansible_distribution_major_version' from source: facts
18699 1726882364.80575: Evaluated conditional (ansible_distribution_major_version != '6'): True
18699 1726882364.80586: variable 'omit' from source: magic vars
18699 1726882364.80639: variable 'omit' from source: magic vars
18699 1726882364.80683: variable 'omit' from source: magic vars
18699 1726882364.80729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
18699 1726882364.80768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
18699 1726882364.80799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
18699 1726882364.80822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882364.80839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
18699 1726882364.80872: variable 'inventory_hostname' from source: host vars for 'managed_node1'
18699 1726882364.80879: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.80891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.80988: Set connection var ansible_connection to ssh
18699 1726882364.81107: Set connection var ansible_pipelining to False
18699 1726882364.81111: Set connection var ansible_shell_executable to /bin/sh
18699 1726882364.81113: Set connection var ansible_timeout to 10
18699 1726882364.81115: Set connection var ansible_shell_type to sh
18699 1726882364.81117: Set connection var ansible_module_compression to ZIP_DEFLATED
18699 1726882364.81119: variable 'ansible_shell_executable' from source: unknown
18699 1726882364.81120: variable 'ansible_connection' from source: unknown
18699 1726882364.81123: variable 'ansible_module_compression' from source: unknown
18699 1726882364.81124: variable 'ansible_shell_type' from source: unknown
18699 1726882364.81127: variable 'ansible_shell_executable' from source: unknown
18699 1726882364.81128: variable 'ansible_host' from source: host vars for 'managed_node1'
18699 1726882364.81131: variable 'ansible_pipelining' from source: unknown
18699 1726882364.81133: variable 'ansible_timeout' from source: unknown
18699 1726882364.81135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
18699 1726882364.81353: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
18699 1726882364.81370: variable 'omit' from source: magic vars
18699 1726882364.81380: starting attempt loop
18699 1726882364.81387: running
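The same six "Set connection var" lines recur before every task in this log; where the subsequent "variable ... from source: unknown" lines appear, no inventory or play value was found, so the value being applied is a default rather than host data. An illustrative sketch of that fallback, using only values visible in the log (the helper name is hypothetical):

```python
# Defaults as applied in the "Set connection var" lines above.
DEFAULTS = {
    "ansible_connection": "ssh",
    "ansible_pipelining": False,
    "ansible_shell_executable": "/bin/sh",
    "ansible_timeout": 10,
    "ansible_shell_type": "sh",
    "ansible_module_compression": "ZIP_DEFLATED",
}

def connection_var(name, host_vars):
    # Host vars (e.g. ansible_host, ansible_ssh_extra_args in this
    # inventory) win; otherwise fall back to the default.
    return host_vars.get(name, DEFAULTS.get(name))

host_vars = {"ansible_host": "10.31.9.159"}
print(connection_var("ansible_timeout", host_vars))  # 10
```

This is why `ansible_host` and `ansible_ssh_extra_args` are reported "from source: host vars" while the rest are "from source: unknown".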
the handler
18699 1726882364.81411: _low_level_execute_command(): starting
18699 1726882364.81435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
18699 1726882364.82169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882364.82187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
18699 1726882364.82210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
18699 1726882364.82318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882364.82338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882364.82424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882364.84053: stdout chunk (state=3): >>>/root <<<
18699 1726882364.84148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882364.84376: stderr chunk (state=3): >>><<<
18699 1726882364.84379: stdout chunk (state=3): >>><<<
18699 1726882364.84382: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18699 1726882364.84384: _low_level_execute_command(): starting
18699 1726882364.84387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008 `" && echo ansible-tmp-1726882364.8430614-20534-228486720374008="` echo /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008 `" ) && sleep 0'
18699 1726882364.86315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882364.86660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882364.86817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882364.88679: stdout chunk (state=3): >>>ansible-tmp-1726882364.8430614-20534-228486720374008=/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008 <<<
18699 1726882364.89268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882364.89272: stdout chunk (state=3): >>><<<
18699 1726882364.89274: stderr chunk (state=3): >>><<<
18699 1726882364.89276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882364.8430614-20534-228486720374008=/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18699 1726882364.89279: variable 'ansible_module_compression' from source: unknown
18699 1726882364.89323: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED
18699 1726882364.89358: variable 'ansible_facts' from source: unknown
18699 1726882364.89559: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py
18699 1726882364.90017: Sending initial data
18699 1726882364.90021: Sent initial data (153 bytes)
18699 1726882364.91129: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882364.91250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
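The remote work directory created above, `/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008`, follows a visible three-field pattern: what appears to be an epoch timestamp, a pid-like number, and a random suffix (the field meanings are an assumption read off the captured path, not documented behavior). A sketch that produces names of the same shape:

```python
import os
import random
import time

def remote_tmp_name():
    # Mirrors the pattern seen in the log:
    #   ansible-tmp-<epoch float>-<pid-like number>-<random digits>
    # Field semantics are inferred from the captured path (assumption).
    return f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"

print(remote_tmp_name())
```

The uniqueness of the suffix is what lets concurrent plays share `/root/.ansible/tmp` without their `mkdir` calls colliding.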
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882364.91292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
18699 1726882364.91387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882364.91427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882364.92968: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
18699 1726882364.93041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
18699 1726882364.93209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmphqzqxhbd /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py <<<
18699 1726882364.93214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmphqzqxhbd" to remote "/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py" <<<
18699 1726882364.95115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882364.95119: stderr chunk (state=3): >>><<<
18699 1726882364.95121: stdout chunk (state=3): >>><<<
18699 1726882364.95147: done transferring module to remote
18699 1726882364.95274: _low_level_execute_command(): starting
18699 1726882364.95282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/ /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py && sleep 0'
18699 1726882364.96964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882364.96969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
18699 1726882364.97011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882364.97165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882364.97214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882364.97273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882364.99062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882364.99065: stdout chunk (state=3): >>><<<
18699 1726882364.99072: stderr chunk (state=3): >>><<<
18699 1726882364.99412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
18699 1726882364.99419: _low_level_execute_command(): starting
18699 1726882364.99422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/AnsiballZ_ping.py && sleep 0'
18699 1726882365.00144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
18699 1726882365.00153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
18699 1726882365.00165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
18699 1726882365.00178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
18699 1726882365.00213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
18699 1726882365.00220: stderr chunk (state=3): >>>debug2: match not found <<<
18699 1726882365.00229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882365.00410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882365.00436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing
master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882365.00447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882365.00464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.00538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882365.15354: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18699 1726882365.16803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882365.16807: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 18699 1726882365.16810: stdout chunk (state=3): >>><<< 18699 1726882365.16812: stderr chunk (state=3): >>><<< 18699 1726882365.16814: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882365.16817: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882365.16820: _low_level_execute_command(): starting 18699 1726882365.16832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882364.8430614-20534-228486720374008/ > /dev/null 2>&1 && sleep 0' 18699 1726882365.17612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882365.17665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882365.17683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882365.17713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.17789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882365.19688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882365.19705: stdout chunk (state=3): >>><<< 18699 1726882365.19717: stderr chunk (state=3): >>><<< 18699 1726882365.19738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882365.19750: handler 
run complete 18699 1726882365.19770: attempt loop complete, returning result 18699 1726882365.19784: _execute() done 18699 1726882365.19795: dumping result to json 18699 1726882365.19806: done dumping result, returning 18699 1726882365.19825: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-1ce6-d207-000000000071] 18699 1726882365.19839: sending task result for task 12673a56-9f93-1ce6-d207-000000000071 ok: [managed_node1] => { "changed": false, "ping": "pong" } 18699 1726882365.20054: no more pending results, returning what we have 18699 1726882365.20058: results queue empty 18699 1726882365.20059: checking for any_errors_fatal 18699 1726882365.20126: done checking for any_errors_fatal 18699 1726882365.20127: checking for max_fail_percentage 18699 1726882365.20129: done checking for max_fail_percentage 18699 1726882365.20130: checking to see if all hosts have failed and the running result is not ok 18699 1726882365.20131: done checking to see if all hosts have failed 18699 1726882365.20131: getting the remaining hosts for this loop 18699 1726882365.20133: done getting the remaining hosts for this loop 18699 1726882365.20137: getting the next task for host managed_node1 18699 1726882365.20145: done getting next task for host managed_node1 18699 1726882365.20146: ^ task is: TASK: meta (role_complete) 18699 1726882365.20148: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882365.20156: done sending task result for task 12673a56-9f93-1ce6-d207-000000000071 18699 1726882365.20158: WORKER PROCESS EXITING 18699 1726882365.20164: getting variables 18699 1726882365.20166: in VariableManager get_vars() 18699 1726882365.20210: Calling all_inventory to load vars for managed_node1 18699 1726882365.20213: Calling groups_inventory to load vars for managed_node1 18699 1726882365.20215: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882365.20231: Calling all_plugins_play to load vars for managed_node1 18699 1726882365.20234: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882365.20237: Calling groups_plugins_play to load vars for managed_node1 18699 1726882365.21857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882365.22803: done with get_vars() 18699 1726882365.22822: done getting variables 18699 1726882365.22939: done queuing things up, now waiting for results queue to drain 18699 1726882365.22941: results queue empty 18699 1726882365.22942: checking for any_errors_fatal 18699 1726882365.22944: done checking for any_errors_fatal 18699 1726882365.22945: checking for max_fail_percentage 18699 1726882365.22946: done checking for max_fail_percentage 18699 1726882365.22947: checking to see if all hosts have failed and the running result is not ok 18699 1726882365.22948: done checking to see if all hosts have failed 18699 1726882365.22948: getting the remaining hosts for this loop 18699 1726882365.22949: done getting the remaining hosts for this loop 18699 1726882365.22952: getting the next task for host managed_node1 18699 1726882365.22956: done getting next task for host managed_node1 18699 1726882365.22957: ^ task is: TASK: meta (flush_handlers) 18699 1726882365.22959: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882365.22961: getting variables 18699 1726882365.22962: in VariableManager get_vars() 18699 1726882365.22974: Calling all_inventory to load vars for managed_node1 18699 1726882365.22977: Calling groups_inventory to load vars for managed_node1 18699 1726882365.22978: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882365.22983: Calling all_plugins_play to load vars for managed_node1 18699 1726882365.22986: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882365.22988: Calling groups_plugins_play to load vars for managed_node1 18699 1726882365.24253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882365.25352: done with get_vars() 18699 1726882365.25371: done getting variables 18699 1726882365.25410: in VariableManager get_vars() 18699 1726882365.25424: Calling all_inventory to load vars for managed_node1 18699 1726882365.25427: Calling groups_inventory to load vars for managed_node1 18699 1726882365.25429: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882365.25434: Calling all_plugins_play to load vars for managed_node1 18699 1726882365.25436: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882365.25438: Calling groups_plugins_play to load vars for managed_node1 18699 1726882365.26862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882365.29573: done with get_vars() 18699 1726882365.29663: done queuing things up, now waiting for results queue to drain 18699 1726882365.29665: results queue empty 18699 1726882365.29666: checking for any_errors_fatal 18699 1726882365.29667: done checking for any_errors_fatal 18699 1726882365.29668: checking for 
max_fail_percentage 18699 1726882365.29669: done checking for max_fail_percentage 18699 1726882365.29670: checking to see if all hosts have failed and the running result is not ok 18699 1726882365.29671: done checking to see if all hosts have failed 18699 1726882365.29671: getting the remaining hosts for this loop 18699 1726882365.29672: done getting the remaining hosts for this loop 18699 1726882365.29675: getting the next task for host managed_node1 18699 1726882365.29679: done getting next task for host managed_node1 18699 1726882365.29680: ^ task is: TASK: meta (flush_handlers) 18699 1726882365.29682: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882365.29685: getting variables 18699 1726882365.29686: in VariableManager get_vars() 18699 1726882365.29702: Calling all_inventory to load vars for managed_node1 18699 1726882365.29704: Calling groups_inventory to load vars for managed_node1 18699 1726882365.29707: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882365.29712: Calling all_plugins_play to load vars for managed_node1 18699 1726882365.29760: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882365.29765: Calling groups_plugins_play to load vars for managed_node1 18699 1726882365.31580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882365.33696: done with get_vars() 18699 1726882365.33717: done getting variables 18699 1726882365.33775: in VariableManager get_vars() 18699 1726882365.33787: Calling all_inventory to load vars for managed_node1 18699 1726882365.33789: Calling groups_inventory to load vars for managed_node1 18699 1726882365.33791: Calling all_plugins_inventory to load vars 
for managed_node1 18699 1726882365.33797: Calling all_plugins_play to load vars for managed_node1 18699 1726882365.33799: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882365.33802: Calling groups_plugins_play to load vars for managed_node1 18699 1726882365.35440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882365.39113: done with get_vars() 18699 1726882365.39177: done queuing things up, now waiting for results queue to drain 18699 1726882365.39180: results queue empty 18699 1726882365.39180: checking for any_errors_fatal 18699 1726882365.39182: done checking for any_errors_fatal 18699 1726882365.39182: checking for max_fail_percentage 18699 1726882365.39183: done checking for max_fail_percentage 18699 1726882365.39190: checking to see if all hosts have failed and the running result is not ok 18699 1726882365.39191: done checking to see if all hosts have failed 18699 1726882365.39196: getting the remaining hosts for this loop 18699 1726882365.39197: done getting the remaining hosts for this loop 18699 1726882365.39200: getting the next task for host managed_node1 18699 1726882365.39203: done getting next task for host managed_node1 18699 1726882365.39204: ^ task is: None 18699 1726882365.39205: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882365.39206: done queuing things up, now waiting for results queue to drain 18699 1726882365.39207: results queue empty 18699 1726882365.39208: checking for any_errors_fatal 18699 1726882365.39209: done checking for any_errors_fatal 18699 1726882365.39209: checking for max_fail_percentage 18699 1726882365.39211: done checking for max_fail_percentage 18699 1726882365.39212: checking to see if all hosts have failed and the running result is not ok 18699 1726882365.39213: done checking to see if all hosts have failed 18699 1726882365.39214: getting the next task for host managed_node1 18699 1726882365.39216: done getting next task for host managed_node1 18699 1726882365.39217: ^ task is: None 18699 1726882365.39218: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882365.39286: in VariableManager get_vars() 18699 1726882365.39309: done with get_vars() 18699 1726882365.39316: in VariableManager get_vars() 18699 1726882365.39470: done with get_vars() 18699 1726882365.39475: variable 'omit' from source: magic vars 18699 1726882365.39584: in VariableManager get_vars() 18699 1726882365.39599: done with get_vars() 18699 1726882365.39626: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 18699 1726882365.40629: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882365.40809: getting the remaining hosts for this loop 18699 1726882365.40811: done getting the remaining hosts for this loop 18699 1726882365.40813: getting the next task for host managed_node1 18699 1726882365.40816: done getting next task for host managed_node1 18699 1726882365.40818: ^ task is: TASK: Gathering Facts 18699 1726882365.40820: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882365.40822: getting variables 18699 1726882365.40823: in VariableManager get_vars() 18699 1726882365.40890: Calling all_inventory to load vars for managed_node1 18699 1726882365.40897: Calling groups_inventory to load vars for managed_node1 18699 1726882365.40900: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882365.40906: Calling all_plugins_play to load vars for managed_node1 18699 1726882365.40908: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882365.40915: Calling groups_plugins_play to load vars for managed_node1 18699 1726882365.42352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882365.43816: done with get_vars() 18699 1726882365.43843: done getting variables 18699 1726882365.43884: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 21:32:45 -0400 (0:00:00.653) 0:00:39.035 ****** 18699 1726882365.43912: entering _queue_task() for managed_node1/gather_facts 18699 1726882365.44846: worker is 1 (out of 1 available) 18699 1726882365.44858: exiting _queue_task() for managed_node1/gather_facts 18699 1726882365.44867: done queuing things up, now waiting for results queue to drain 18699 1726882365.44868: waiting for pending results... 
18699 1726882365.45195: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882365.45319: in run() - task 12673a56-9f93-1ce6-d207-0000000004e4 18699 1726882365.45348: variable 'ansible_search_path' from source: unknown 18699 1726882365.45388: calling self._execute() 18699 1726882365.45496: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882365.45803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882365.45807: variable 'omit' from source: magic vars 18699 1726882365.46396: variable 'ansible_distribution_major_version' from source: facts 18699 1726882365.46469: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882365.46525: variable 'omit' from source: magic vars 18699 1726882365.46603: variable 'omit' from source: magic vars 18699 1726882365.46773: variable 'omit' from source: magic vars 18699 1726882365.47010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882365.47099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882365.47196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882365.47222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882365.47339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882365.47343: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882365.47345: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882365.47390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882365.47702: Set connection var ansible_connection to ssh 18699 1726882365.47706: Set 
connection var ansible_pipelining to False 18699 1726882365.47709: Set connection var ansible_shell_executable to /bin/sh 18699 1726882365.47711: Set connection var ansible_timeout to 10 18699 1726882365.47713: Set connection var ansible_shell_type to sh 18699 1726882365.47716: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882365.47731: variable 'ansible_shell_executable' from source: unknown 18699 1726882365.47900: variable 'ansible_connection' from source: unknown 18699 1726882365.47903: variable 'ansible_module_compression' from source: unknown 18699 1726882365.47906: variable 'ansible_shell_type' from source: unknown 18699 1726882365.47908: variable 'ansible_shell_executable' from source: unknown 18699 1726882365.47910: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882365.47913: variable 'ansible_pipelining' from source: unknown 18699 1726882365.47915: variable 'ansible_timeout' from source: unknown 18699 1726882365.47917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882365.48371: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882365.48402: variable 'omit' from source: magic vars 18699 1726882365.48443: starting attempt loop 18699 1726882365.48538: running the handler 18699 1726882365.48547: variable 'ansible_facts' from source: unknown 18699 1726882365.48589: _low_level_execute_command(): starting 18699 1726882365.48603: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882365.49511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882365.49534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.49614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882365.51264: stdout chunk (state=3): >>>/root <<< 18699 1726882365.51336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882365.51366: stderr chunk (state=3): >>><<< 18699 1726882365.51369: stdout chunk (state=3): >>><<< 18699 1726882365.51398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882365.51408: _low_level_execute_command(): starting 18699 1726882365.51415: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803 `" && echo ansible-tmp-1726882365.513953-20561-227366679432803="` echo /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803 `" ) && sleep 0' 18699 1726882365.51867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882365.51870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882365.51873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882365.51883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882365.51934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882365.51942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.51981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882365.53881: stdout chunk (state=3): >>>ansible-tmp-1726882365.513953-20561-227366679432803=/root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803 <<< 18699 1726882365.54016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882365.54021: stderr chunk (state=3): >>><<< 18699 1726882365.54023: stdout chunk (state=3): >>><<< 18699 1726882365.54041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882365.513953-20561-227366679432803=/root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882365.54206: variable 'ansible_module_compression' from source: unknown 18699 1726882365.54209: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882365.54214: variable 'ansible_facts' from source: unknown 18699 1726882365.54438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py 18699 1726882365.54659: Sending initial data 18699 1726882365.54668: Sent initial data (153 bytes) 18699 1726882365.55556: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882365.55569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882365.55580: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882365.55639: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882365.55651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882365.55675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.55761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882365.57451: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882365.57489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882365.57544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpgpx5lq2v /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py <<< 18699 1726882365.57548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py" <<< 18699 1726882365.57581: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpgpx5lq2v" to remote "/root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py" <<< 18699 1726882365.61415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882365.61806: stdout chunk (state=3): >>><<< 18699 1726882365.61811: stderr chunk (state=3): >>><<< 18699 1726882365.61813: done transferring module to remote 18699 1726882365.61816: _low_level_execute_command(): starting 18699 1726882365.61818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/ /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py && sleep 0' 18699 1726882365.62736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882365.62751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882365.62808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882365.63016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882365.63073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.63181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882365.65049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882365.65114: stdout chunk (state=3): >>><<< 18699 1726882365.65127: stderr chunk (state=3): >>><<< 18699 1726882365.65429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882365.65432: _low_level_execute_command(): starting 18699 1726882365.65435: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/AnsiballZ_setup.py && sleep 0' 18699 1726882365.66709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882365.66932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882365.66977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882366.29978: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "45", "epoch": "1726882365", "epoch_int": "1726882365", "date": "2024-09-20", "time": "21:32:45", "iso8601_micro": "2024-09-21T01:32:45.938764Z", "iso8601": "2024-09-21T01:32:45Z", "iso8601_basic": "20240920T213245938764", "iso8601_basic_short": "20240920T213245", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2942, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb<<< 18699 1726882366.30030: stdout chunk (state=3): >>>": {"real": {"total": 3531, "used": 589, "free": 2942}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 799, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794832384, "block_size": 4096, "block_total": 65519099, "block_available": 63914754, "block_used": 1604345, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.47412109375, "5m": 0.3349609375, "15m": 0.1669921875}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": 
"on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off 
[fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed<<< 18699 1726882366.30063: stdout chunk (state=3): >>>]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_fibre_channel_wwn": [], "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882366.32004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882366.32036: stdout chunk (state=3): >>><<< 18699 1726882366.32040: stderr chunk (state=3): >>><<< 18699 1726882366.32080: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "45", "epoch": "1726882365", "epoch_int": "1726882365", "date": "2024-09-20", "time": "21:32:45", "iso8601_micro": "2024-09-21T01:32:45.938764Z", "iso8601": "2024-09-21T01:32:45Z", "iso8601_basic": "20240920T213245938764", "iso8601_basic_short": "20240920T213245", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2942, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 589, "free": 2942}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": 
[]}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 799, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794832384, "block_size": 4096, "block_total": 65519099, "block_available": 63914754, "block_used": 1604345, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.47412109375, "5m": 0.3349609375, "15m": 0.1669921875}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882366.33511: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882366.33515: _low_level_execute_command(): starting 18699 1726882366.33517: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882365.513953-20561-227366679432803/ > /dev/null 2>&1 && sleep 0' 18699 1726882366.34735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882366.34750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882366.34766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882366.34784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882366.34805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882366.34829: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882366.34931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882366.34958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882366.35130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882366.36942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882366.36950: stdout chunk (state=3): >>><<< 18699 1726882366.36959: stderr chunk (state=3): >>><<< 18699 1726882366.37002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882366.37303: handler run complete 18699 1726882366.37306: variable 'ansible_facts' from source: unknown 18699 1726882366.37586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.38339: variable 'ansible_facts' from source: unknown 18699 1726882366.38500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.38910: attempt loop complete, returning result 18699 1726882366.38953: _execute() done 18699 1726882366.38979: dumping result to json 18699 1726882366.39016: done dumping result, returning 18699 1726882366.39078: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-0000000004e4] 18699 1726882366.39110: sending task result for task 12673a56-9f93-1ce6-d207-0000000004e4 18699 1726882366.40300: done sending task result for task 12673a56-9f93-1ce6-d207-0000000004e4 18699 1726882366.40304: WORKER PROCESS EXITING ok: [managed_node1] 18699 1726882366.40930: no more pending results, returning what we have 18699 1726882366.40933: results queue empty 18699 1726882366.40934: checking for any_errors_fatal 18699 1726882366.40936: done checking for any_errors_fatal 18699 1726882366.40937: checking for max_fail_percentage 18699 1726882366.40938: done checking for max_fail_percentage 18699 1726882366.40939: checking to see if all hosts have failed and the running result is not ok 18699 1726882366.40940: done checking to see if all hosts have failed 18699 1726882366.40940: getting the remaining hosts for this loop 18699 1726882366.40942: done getting the remaining hosts for this loop 18699 1726882366.40945: getting the next task for host managed_node1 18699 
1726882366.40950: done getting next task for host managed_node1 18699 1726882366.40952: ^ task is: TASK: meta (flush_handlers) 18699 1726882366.40954: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882366.40958: getting variables 18699 1726882366.40959: in VariableManager get_vars() 18699 1726882366.40980: Calling all_inventory to load vars for managed_node1 18699 1726882366.40983: Calling groups_inventory to load vars for managed_node1 18699 1726882366.40986: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.41100: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.41104: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.41108: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.44457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.47748: done with get_vars() 18699 1726882366.47778: done getting variables 18699 1726882366.47887: in VariableManager get_vars() 18699 1726882366.47900: Calling all_inventory to load vars for managed_node1 18699 1726882366.47902: Calling groups_inventory to load vars for managed_node1 18699 1726882366.47905: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.47910: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.47912: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.47915: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.49176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.51924: done with get_vars() 18699 
1726882366.51956: done queuing things up, now waiting for results queue to drain 18699 1726882366.51959: results queue empty 18699 1726882366.51960: checking for any_errors_fatal 18699 1726882366.51970: done checking for any_errors_fatal 18699 1726882366.51972: checking for max_fail_percentage 18699 1726882366.51977: done checking for max_fail_percentage 18699 1726882366.51978: checking to see if all hosts have failed and the running result is not ok 18699 1726882366.51979: done checking to see if all hosts have failed 18699 1726882366.51979: getting the remaining hosts for this loop 18699 1726882366.51980: done getting the remaining hosts for this loop 18699 1726882366.51983: getting the next task for host managed_node1 18699 1726882366.51987: done getting next task for host managed_node1 18699 1726882366.51990: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 18699 1726882366.51991: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882366.51995: getting variables 18699 1726882366.51996: in VariableManager get_vars() 18699 1726882366.52007: Calling all_inventory to load vars for managed_node1 18699 1726882366.52009: Calling groups_inventory to load vars for managed_node1 18699 1726882366.52011: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.52017: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.52020: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.52023: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.53428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.55883: done with get_vars() 18699 1726882366.55907: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 21:32:46 -0400 (0:00:01.120) 0:00:40.156 ****** 18699 1726882366.55997: entering _queue_task() for managed_node1/include_tasks 18699 1726882366.56379: worker is 1 (out of 1 available) 18699 1726882366.56391: exiting _queue_task() for managed_node1/include_tasks 18699 1726882366.56514: done queuing things up, now waiting for results queue to drain 18699 1726882366.56515: waiting for pending results... 
18699 1726882366.56711: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' 18699 1726882366.56840: in run() - task 12673a56-9f93-1ce6-d207-000000000074 18699 1726882366.56949: variable 'ansible_search_path' from source: unknown 18699 1726882366.56952: calling self._execute() 18699 1726882366.56998: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.57009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.57022: variable 'omit' from source: magic vars 18699 1726882366.57918: variable 'ansible_distribution_major_version' from source: facts 18699 1726882366.57921: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882366.57924: _execute() done 18699 1726882366.57926: dumping result to json 18699 1726882366.57929: done dumping result, returning 18699 1726882366.57931: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_absent.yml' [12673a56-9f93-1ce6-d207-000000000074] 18699 1726882366.57934: sending task result for task 12673a56-9f93-1ce6-d207-000000000074 18699 1726882366.58299: done sending task result for task 12673a56-9f93-1ce6-d207-000000000074 18699 1726882366.58303: WORKER PROCESS EXITING 18699 1726882366.58331: no more pending results, returning what we have 18699 1726882366.58337: in VariableManager get_vars() 18699 1726882366.58376: Calling all_inventory to load vars for managed_node1 18699 1726882366.58380: Calling groups_inventory to load vars for managed_node1 18699 1726882366.58383: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.58400: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.58403: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.58407: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.60644: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.62732: done with get_vars() 18699 1726882366.62751: variable 'ansible_search_path' from source: unknown 18699 1726882366.62764: we have included files to process 18699 1726882366.62765: generating all_blocks data 18699 1726882366.62766: done generating all_blocks data 18699 1726882366.62767: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18699 1726882366.62768: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18699 1726882366.62770: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18699 1726882366.63065: in VariableManager get_vars() 18699 1726882366.63081: done with get_vars() 18699 1726882366.63351: done processing included file 18699 1726882366.63353: iterating over new_blocks loaded from include file 18699 1726882366.63354: in VariableManager get_vars() 18699 1726882366.63366: done with get_vars() 18699 1726882366.63367: filtering new block on tags 18699 1726882366.63383: done filtering new block on tags 18699 1726882366.63386: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 18699 1726882366.63390: extending task lists for all hosts with included blocks 18699 1726882366.63440: done extending task lists 18699 1726882366.63442: done processing included files 18699 1726882366.63442: results queue empty 18699 1726882366.63443: checking for any_errors_fatal 18699 1726882366.63444: done checking for any_errors_fatal 18699 1726882366.63445: checking for max_fail_percentage 18699 1726882366.63446: done 
checking for max_fail_percentage 18699 1726882366.63447: checking to see if all hosts have failed and the running result is not ok 18699 1726882366.63448: done checking to see if all hosts have failed 18699 1726882366.63448: getting the remaining hosts for this loop 18699 1726882366.63450: done getting the remaining hosts for this loop 18699 1726882366.63566: getting the next task for host managed_node1 18699 1726882366.63571: done getting next task for host managed_node1 18699 1726882366.63573: ^ task is: TASK: Include the task 'get_profile_stat.yml' 18699 1726882366.63576: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882366.63578: getting variables 18699 1726882366.63579: in VariableManager get_vars() 18699 1726882366.63588: Calling all_inventory to load vars for managed_node1 18699 1726882366.63590: Calling groups_inventory to load vars for managed_node1 18699 1726882366.63644: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.63653: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.63656: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.63658: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.65108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.67806: done with get_vars() 18699 1726882366.67835: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:32:46 -0400 (0:00:00.119) 0:00:40.275 ****** 18699 1726882366.67927: entering _queue_task() for managed_node1/include_tasks 18699 1726882366.68284: worker is 1 (out of 1 available) 18699 1726882366.68300: exiting _queue_task() for managed_node1/include_tasks 18699 1726882366.68312: done queuing things up, now waiting for results queue to drain 18699 1726882366.68313: waiting for pending results... 
18699 1726882366.68794: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 18699 1726882366.68999: in run() - task 12673a56-9f93-1ce6-d207-0000000004f5 18699 1726882366.69003: variable 'ansible_search_path' from source: unknown 18699 1726882366.69006: variable 'ansible_search_path' from source: unknown 18699 1726882366.69152: calling self._execute() 18699 1726882366.69241: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.69245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.69257: variable 'omit' from source: magic vars 18699 1726882366.70371: variable 'ansible_distribution_major_version' from source: facts 18699 1726882366.70383: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882366.70390: _execute() done 18699 1726882366.70396: dumping result to json 18699 1726882366.70799: done dumping result, returning 18699 1726882366.70804: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-1ce6-d207-0000000004f5] 18699 1726882366.70807: sending task result for task 12673a56-9f93-1ce6-d207-0000000004f5 18699 1726882366.70873: done sending task result for task 12673a56-9f93-1ce6-d207-0000000004f5 18699 1726882366.70875: WORKER PROCESS EXITING 18699 1726882366.70907: no more pending results, returning what we have 18699 1726882366.70912: in VariableManager get_vars() 18699 1726882366.70946: Calling all_inventory to load vars for managed_node1 18699 1726882366.70949: Calling groups_inventory to load vars for managed_node1 18699 1726882366.70952: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.70966: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.70968: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.70971: Calling groups_plugins_play to load vars for managed_node1 18699 
1726882366.74639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.76589: done with get_vars() 18699 1726882366.76626: variable 'ansible_search_path' from source: unknown 18699 1726882366.76628: variable 'ansible_search_path' from source: unknown 18699 1726882366.76673: we have included files to process 18699 1726882366.76675: generating all_blocks data 18699 1726882366.76677: done generating all_blocks data 18699 1726882366.76678: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18699 1726882366.76680: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18699 1726882366.76682: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18699 1726882366.77905: done processing included file 18699 1726882366.77908: iterating over new_blocks loaded from include file 18699 1726882366.77913: in VariableManager get_vars() 18699 1726882366.77928: done with get_vars() 18699 1726882366.77930: filtering new block on tags 18699 1726882366.77959: done filtering new block on tags 18699 1726882366.77963: in VariableManager get_vars() 18699 1726882366.77976: done with get_vars() 18699 1726882366.77978: filtering new block on tags 18699 1726882366.78003: done filtering new block on tags 18699 1726882366.78005: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 18699 1726882366.78011: extending task lists for all hosts with included blocks 18699 1726882366.78137: done extending task lists 18699 1726882366.78138: done processing included files 18699 1726882366.78139: results queue empty 18699 
1726882366.78140: checking for any_errors_fatal 18699 1726882366.78143: done checking for any_errors_fatal 18699 1726882366.78144: checking for max_fail_percentage 18699 1726882366.78145: done checking for max_fail_percentage 18699 1726882366.78146: checking to see if all hosts have failed and the running result is not ok 18699 1726882366.78147: done checking to see if all hosts have failed 18699 1726882366.78147: getting the remaining hosts for this loop 18699 1726882366.78148: done getting the remaining hosts for this loop 18699 1726882366.78151: getting the next task for host managed_node1 18699 1726882366.78155: done getting next task for host managed_node1 18699 1726882366.78157: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 18699 1726882366.78160: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882366.78162: getting variables 18699 1726882366.78163: in VariableManager get_vars() 18699 1726882366.78375: Calling all_inventory to load vars for managed_node1 18699 1726882366.78378: Calling groups_inventory to load vars for managed_node1 18699 1726882366.78380: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.78386: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.78396: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.78400: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.80635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.83421: done with get_vars() 18699 1726882366.83445: done getting variables 18699 1726882366.83498: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:32:46 -0400 (0:00:00.156) 0:00:40.431 ****** 18699 1726882366.83529: entering _queue_task() for managed_node1/set_fact 18699 1726882366.83880: worker is 1 (out of 1 available) 18699 1726882366.83891: exiting _queue_task() for managed_node1/set_fact 18699 1726882366.83913: done queuing things up, now waiting for results queue to drain 18699 1726882366.83915: waiting for pending results... 
18699 1726882366.84270: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 18699 1726882366.84294: in run() - task 12673a56-9f93-1ce6-d207-000000000502 18699 1726882366.84319: variable 'ansible_search_path' from source: unknown 18699 1726882366.84326: variable 'ansible_search_path' from source: unknown 18699 1726882366.84368: calling self._execute() 18699 1726882366.84461: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.84476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.84491: variable 'omit' from source: magic vars 18699 1726882366.84899: variable 'ansible_distribution_major_version' from source: facts 18699 1726882366.84920: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882366.84934: variable 'omit' from source: magic vars 18699 1726882366.84980: variable 'omit' from source: magic vars 18699 1726882366.85099: variable 'omit' from source: magic vars 18699 1726882366.85102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882366.85105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882366.85133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882366.85155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882366.85173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882366.85501: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882366.85504: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.85506: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 18699 1726882366.85514: Set connection var ansible_connection to ssh 18699 1726882366.85525: Set connection var ansible_pipelining to False 18699 1726882366.85534: Set connection var ansible_shell_executable to /bin/sh 18699 1726882366.85543: Set connection var ansible_timeout to 10 18699 1726882366.85550: Set connection var ansible_shell_type to sh 18699 1726882366.85558: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882366.85589: variable 'ansible_shell_executable' from source: unknown 18699 1726882366.85799: variable 'ansible_connection' from source: unknown 18699 1726882366.85803: variable 'ansible_module_compression' from source: unknown 18699 1726882366.85805: variable 'ansible_shell_type' from source: unknown 18699 1726882366.85808: variable 'ansible_shell_executable' from source: unknown 18699 1726882366.85810: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.85816: variable 'ansible_pipelining' from source: unknown 18699 1726882366.85820: variable 'ansible_timeout' from source: unknown 18699 1726882366.85823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.85978: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882366.86099: variable 'omit' from source: magic vars 18699 1726882366.86102: starting attempt loop 18699 1726882366.86105: running the handler 18699 1726882366.86107: handler run complete 18699 1726882366.86109: attempt loop complete, returning result 18699 1726882366.86111: _execute() done 18699 1726882366.86113: dumping result to json 18699 1726882366.86115: done dumping result, returning 18699 1726882366.86163: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-1ce6-d207-000000000502] 18699 1726882366.86172: sending task result for task 12673a56-9f93-1ce6-d207-000000000502 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 18699 1726882366.86410: no more pending results, returning what we have 18699 1726882366.86417: results queue empty 18699 1726882366.86418: checking for any_errors_fatal 18699 1726882366.86421: done checking for any_errors_fatal 18699 1726882366.86422: checking for max_fail_percentage 18699 1726882366.86423: done checking for max_fail_percentage 18699 1726882366.86424: checking to see if all hosts have failed and the running result is not ok 18699 1726882366.86424: done checking to see if all hosts have failed 18699 1726882366.86425: getting the remaining hosts for this loop 18699 1726882366.86427: done getting the remaining hosts for this loop 18699 1726882366.86430: getting the next task for host managed_node1 18699 1726882366.86436: done getting next task for host managed_node1 18699 1726882366.86439: ^ task is: TASK: Stat profile file 18699 1726882366.86443: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882366.86448: getting variables 18699 1726882366.86450: in VariableManager get_vars() 18699 1726882366.86477: Calling all_inventory to load vars for managed_node1 18699 1726882366.86480: Calling groups_inventory to load vars for managed_node1 18699 1726882366.86483: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882366.86499: Calling all_plugins_play to load vars for managed_node1 18699 1726882366.86503: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882366.86507: Calling groups_plugins_play to load vars for managed_node1 18699 1726882366.87081: done sending task result for task 12673a56-9f93-1ce6-d207-000000000502 18699 1726882366.87085: WORKER PROCESS EXITING 18699 1726882366.88424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882366.90574: done with get_vars() 18699 1726882366.90731: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:32:46 -0400 (0:00:00.073) 0:00:40.504 ****** 18699 1726882366.90866: entering _queue_task() for managed_node1/stat 18699 1726882366.91572: worker is 1 (out of 1 available) 18699 1726882366.91584: exiting _queue_task() for managed_node1/stat 18699 1726882366.91597: done queuing things up, now waiting for results queue to drain 18699 1726882366.91598: waiting for pending results... 
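A stream like the one above is hard to scan because every debug entry runs together on one line. Each entry follows the `<pid> <epoch.micros>: <message>` prefix format seen throughout this log (e.g. `18699 1726882366.90866: entering _queue_task() ...`). As an illustrative sketch (this helper is not part of Ansible; it only assumes that prefix format), the stream can be split back into individual entries, which also makes per-step timing easy to compute from the timestamps:

```python
import re

# Matches the "<pid> <unix_timestamp.micros>: " prefix that starts every
# entry in this verbose ansible-playbook log.
ENTRY = re.compile(r"(\d+) (\d+\.\d+): ")

def split_entries(text):
    """Split raw log text into (pid, timestamp, message) tuples."""
    # re.split with two capture groups yields:
    #   [leading_text, pid, ts, msg, pid, ts, msg, ...]
    parts = ENTRY.split(text)
    entries = []
    for i in range(1, len(parts) - 2, 3):
        pid, ts, msg = parts[i], parts[i + 1], parts[i + 2]
        entries.append((int(pid), float(ts), msg.strip()))
    return entries
```

For example, feeding it two concatenated entries from this log yields two tuples whose timestamp difference gives the elapsed time between the steps.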
18699 1726882366.91970: running TaskExecutor() for managed_node1/TASK: Stat profile file 18699 1726882366.92173: in run() - task 12673a56-9f93-1ce6-d207-000000000503 18699 1726882366.92307: variable 'ansible_search_path' from source: unknown 18699 1726882366.92311: variable 'ansible_search_path' from source: unknown 18699 1726882366.92340: calling self._execute() 18699 1726882366.92536: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.92540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.92552: variable 'omit' from source: magic vars 18699 1726882366.93332: variable 'ansible_distribution_major_version' from source: facts 18699 1726882366.93343: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882366.93350: variable 'omit' from source: magic vars 18699 1726882366.93517: variable 'omit' from source: magic vars 18699 1726882366.93800: variable 'profile' from source: include params 18699 1726882366.93803: variable 'interface' from source: set_fact 18699 1726882366.93806: variable 'interface' from source: set_fact 18699 1726882366.93943: variable 'omit' from source: magic vars 18699 1726882366.93983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882366.94022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882366.94123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882366.94252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882366.94264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882366.94296: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 
1726882366.94302: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.94305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.94554: Set connection var ansible_connection to ssh 18699 1726882366.94557: Set connection var ansible_pipelining to False 18699 1726882366.94559: Set connection var ansible_shell_executable to /bin/sh 18699 1726882366.94561: Set connection var ansible_timeout to 10 18699 1726882366.94563: Set connection var ansible_shell_type to sh 18699 1726882366.94565: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882366.94687: variable 'ansible_shell_executable' from source: unknown 18699 1726882366.94690: variable 'ansible_connection' from source: unknown 18699 1726882366.94696: variable 'ansible_module_compression' from source: unknown 18699 1726882366.94698: variable 'ansible_shell_type' from source: unknown 18699 1726882366.94701: variable 'ansible_shell_executable' from source: unknown 18699 1726882366.94773: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882366.94776: variable 'ansible_pipelining' from source: unknown 18699 1726882366.94779: variable 'ansible_timeout' from source: unknown 18699 1726882366.94781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882366.95144: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882366.95155: variable 'omit' from source: magic vars 18699 1726882366.95161: starting attempt loop 18699 1726882366.95163: running the handler 18699 1726882366.95179: _low_level_execute_command(): starting 18699 1726882366.95187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882366.96606: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882366.96907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882366.96912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882366.97006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882366.98678: stdout chunk (state=3): >>>/root <<< 18699 1726882366.98872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882366.98876: stdout chunk (state=3): >>><<< 18699 1726882366.98881: stderr chunk (state=3): >>><<< 18699 1726882366.98906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882366.98919: _low_level_execute_command(): starting 18699 1726882366.98927: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642 `" && echo ansible-tmp-1726882366.9890568-20631-246872310245642="` echo /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642 `" ) && sleep 0' 18699 1726882367.00017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882367.00216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.00224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882367.00234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.00338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882367.00341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.00346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882367.00349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.00400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882367.00403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.00433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.00498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.02800: stdout chunk (state=3): >>>ansible-tmp-1726882366.9890568-20631-246872310245642=/root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642 <<< 18699 1726882367.02803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.02805: stdout chunk (state=3): >>><<< 18699 1726882367.02808: stderr chunk (state=3): >>><<< 18699 1726882367.02810: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882366.9890568-20631-246872310245642=/root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.02813: variable 'ansible_module_compression' from source: unknown 18699 1726882367.02815: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18699 1726882367.02816: variable 'ansible_facts' from source: unknown 18699 1726882367.03072: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py 18699 1726882367.03375: Sending initial data 18699 1726882367.03384: Sent initial data (153 bytes) 18699 1726882367.03874: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882367.03888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.03998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.04011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.04087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.05611: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882367.05663: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882367.05899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp154m_n0t /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py <<< 18699 1726882367.05902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py" <<< 18699 1726882367.05909: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp154m_n0t" to remote "/root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py" <<< 18699 1726882367.06934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.07013: stderr chunk (state=3): >>><<< 18699 1726882367.07022: stdout chunk (state=3): >>><<< 18699 1726882367.07051: done transferring module to remote 18699 1726882367.07071: _low_level_execute_command(): starting 18699 1726882367.07159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/ /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py && sleep 0' 18699 1726882367.07647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882367.07650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.07699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.07703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882367.07706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 
1726882367.07709: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882367.07711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.07773: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.07823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.07870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.09929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.09933: stdout chunk (state=3): >>><<< 18699 1726882367.09936: stderr chunk (state=3): >>><<< 18699 1726882367.09939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.09941: _low_level_execute_command(): starting 18699 1726882367.09943: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/AnsiballZ_stat.py && sleep 0' 18699 1726882367.10565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882367.10608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.10612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.10614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882367.10617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882367.10619: stderr chunk (state=3): >>>debug2: match not found <<< 18699 1726882367.10629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.10666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18699 1726882367.10669: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 18699 1726882367.10672: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18699 1726882367.10714: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 18699 1726882367.10719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.10721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882367.10723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 18699 1726882367.10770: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.10809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.10882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.25723: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18699 1726882367.26954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882367.26998: stderr chunk (state=3): >>><<< 18699 1726882367.27001: stdout chunk (state=3): >>><<< 18699 1726882367.27016: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
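The remote `stat` run above returned only `{"changed": false, "stat": {"exists": false}}` for `/etc/sysconfig/network-scripts/ifcfg-lsr27`, since `get_attributes`, `get_checksum`, and `get_mime` were all disabled and the file is absent. As a rough, heavily simplified sketch (this is not Ansible's actual AnsiballZ `stat` module, just an approximation of the result shape visible in the log), the check amounts to:

```python
import os

def stat_exists(path):
    """Approximate the stat-module result shape seen in the log above.

    Simplified assumption: when the file is absent, only
    {"exists": false} is reported; otherwise a few basic fields are
    filled in. Ansible's real module returns many more fields.
    """
    if not os.path.lexists(path):
        return {"changed": False, "stat": {"exists": False}}
    st = os.stat(path)
    return {
        "changed": False,
        "stat": {
            "exists": True,
            "isreg": os.path.isfile(path),
            "size": st.st_size,
        },
    }
```

On a host without the legacy initscripts profile, `stat_exists("/etc/sysconfig/network-scripts/ifcfg-lsr27")` would produce the same `exists: false` result that the downstream `Set NM profile exist flag` task then evaluates in its `when:` condition.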
18699 1726882367.27038: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882367.27046: _low_level_execute_command(): starting 18699 1726882367.27051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882366.9890568-20631-246872310245642/ > /dev/null 2>&1 && sleep 0' 18699 1726882367.27583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.27587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.27590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882367.27592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882367.27607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.27629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.27632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.27683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.29538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.29542: stdout chunk (state=3): >>><<< 18699 1726882367.29544: stderr chunk (state=3): >>><<< 18699 1726882367.29699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.29703: handler run complete 18699 1726882367.29706: attempt loop complete, returning result 18699 1726882367.29708: _execute() done 18699 1726882367.29710: dumping result to json 18699 1726882367.29712: done dumping result, returning 18699 1726882367.29714: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-1ce6-d207-000000000503] 18699 1726882367.29716: sending task result for task 12673a56-9f93-1ce6-d207-000000000503 18699 1726882367.29805: done sending task result for task 12673a56-9f93-1ce6-d207-000000000503 ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18699 1726882367.29861: no more pending results, returning what we have 18699 1726882367.29865: results queue empty 18699 1726882367.29866: checking for any_errors_fatal 18699 1726882367.29873: done checking for any_errors_fatal 18699 1726882367.29874: checking for max_fail_percentage 18699 1726882367.29876: done checking for max_fail_percentage 18699 1726882367.29877: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.29877: done checking to see if all hosts have failed 18699 1726882367.29878: getting the remaining hosts for this loop 18699 1726882367.29880: done getting the remaining hosts for this loop 18699 1726882367.29884: getting the next task for host managed_node1 18699 1726882367.29891: done getting next task for host managed_node1 18699 1726882367.29896: ^ task is: TASK: Set NM profile exist flag based on the profile files 18699 1726882367.29900: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882367.30020: getting variables 18699 1726882367.30023: in VariableManager get_vars() 18699 1726882367.30057: Calling all_inventory to load vars for managed_node1 18699 1726882367.30060: Calling groups_inventory to load vars for managed_node1 18699 1726882367.30064: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.30077: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.30081: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.30085: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.30609: WORKER PROCESS EXITING 18699 1726882367.31046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.32236: done with get_vars() 18699 1726882367.32256: done getting variables 18699 1726882367.32309: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:32:47 -0400 (0:00:00.414) 0:00:40.919 ****** 18699 1726882367.32345: entering _queue_task() for managed_node1/set_fact 18699 1726882367.32637: worker is 1 (out of 1 available) 18699 1726882367.32649: exiting _queue_task() for managed_node1/set_fact 18699 1726882367.32662: done queuing things up, now waiting for results queue to drain 18699 1726882367.32663: waiting for pending results... 18699 1726882367.32938: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 18699 1726882367.33095: in run() - task 12673a56-9f93-1ce6-d207-000000000504 18699 1726882367.33106: variable 'ansible_search_path' from source: unknown 18699 1726882367.33118: variable 'ansible_search_path' from source: unknown 18699 1726882367.33200: calling self._execute() 18699 1726882367.33258: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.33264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.33284: variable 'omit' from source: magic vars 18699 1726882367.33821: variable 'ansible_distribution_major_version' from source: facts 18699 1726882367.33832: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882367.34005: variable 'profile_stat' from source: set_fact 18699 1726882367.34026: Evaluated conditional (profile_stat.stat.exists): False 18699 1726882367.34033: when evaluation is False, skipping this task 18699 1726882367.34036: _execute() done 18699 1726882367.34039: dumping result to json 18699 1726882367.34041: done dumping result, returning 18699 1726882367.34044: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-1ce6-d207-000000000504] 18699 1726882367.34049: sending task result for task 
12673a56-9f93-1ce6-d207-000000000504 18699 1726882367.34282: done sending task result for task 12673a56-9f93-1ce6-d207-000000000504 18699 1726882367.34286: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18699 1726882367.34357: no more pending results, returning what we have 18699 1726882367.34361: results queue empty 18699 1726882367.34362: checking for any_errors_fatal 18699 1726882367.34370: done checking for any_errors_fatal 18699 1726882367.34371: checking for max_fail_percentage 18699 1726882367.34373: done checking for max_fail_percentage 18699 1726882367.34374: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.34375: done checking to see if all hosts have failed 18699 1726882367.34376: getting the remaining hosts for this loop 18699 1726882367.34379: done getting the remaining hosts for this loop 18699 1726882367.34382: getting the next task for host managed_node1 18699 1726882367.34390: done getting next task for host managed_node1 18699 1726882367.34509: ^ task is: TASK: Get NM profile info 18699 1726882367.34521: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18699 1726882367.34525: getting variables 18699 1726882367.34526: in VariableManager get_vars() 18699 1726882367.34557: Calling all_inventory to load vars for managed_node1 18699 1726882367.34560: Calling groups_inventory to load vars for managed_node1 18699 1726882367.34563: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.34577: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.34581: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.34585: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.38965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.40202: done with get_vars() 18699 1726882367.40222: done getting variables 18699 1726882367.40288: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:32:47 -0400 (0:00:00.079) 0:00:40.999 ****** 18699 1726882367.40315: entering _queue_task() for managed_node1/shell 18699 1726882367.40317: Creating lock for shell 18699 1726882367.40667: worker is 1 (out of 1 available) 18699 1726882367.40682: exiting _queue_task() for managed_node1/shell 18699 1726882367.40699: done queuing things up, now waiting for results queue to drain 18699 1726882367.40701: waiting for pending results... 
18699 1726882367.40925: running TaskExecutor() for managed_node1/TASK: Get NM profile info 18699 1726882367.41042: in run() - task 12673a56-9f93-1ce6-d207-000000000505 18699 1726882367.41065: variable 'ansible_search_path' from source: unknown 18699 1726882367.41069: variable 'ansible_search_path' from source: unknown 18699 1726882367.41120: calling self._execute() 18699 1726882367.41192: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.41216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.41221: variable 'omit' from source: magic vars 18699 1726882367.41566: variable 'ansible_distribution_major_version' from source: facts 18699 1726882367.41575: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882367.41581: variable 'omit' from source: magic vars 18699 1726882367.41632: variable 'omit' from source: magic vars 18699 1726882367.41717: variable 'profile' from source: include params 18699 1726882367.41721: variable 'interface' from source: set_fact 18699 1726882367.41781: variable 'interface' from source: set_fact 18699 1726882367.41800: variable 'omit' from source: magic vars 18699 1726882367.41832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882367.41892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882367.41913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882367.41917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882367.41929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882367.41960: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 
1726882367.41964: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.41966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.42039: Set connection var ansible_connection to ssh 18699 1726882367.42045: Set connection var ansible_pipelining to False 18699 1726882367.42051: Set connection var ansible_shell_executable to /bin/sh 18699 1726882367.42056: Set connection var ansible_timeout to 10 18699 1726882367.42058: Set connection var ansible_shell_type to sh 18699 1726882367.42065: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882367.42086: variable 'ansible_shell_executable' from source: unknown 18699 1726882367.42090: variable 'ansible_connection' from source: unknown 18699 1726882367.42095: variable 'ansible_module_compression' from source: unknown 18699 1726882367.42098: variable 'ansible_shell_type' from source: unknown 18699 1726882367.42100: variable 'ansible_shell_executable' from source: unknown 18699 1726882367.42102: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.42104: variable 'ansible_pipelining' from source: unknown 18699 1726882367.42107: variable 'ansible_timeout' from source: unknown 18699 1726882367.42116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.42213: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882367.42223: variable 'omit' from source: magic vars 18699 1726882367.42229: starting attempt loop 18699 1726882367.42232: running the handler 18699 1726882367.42240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882367.42254: _low_level_execute_command(): starting 18699 1726882367.42261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882367.42995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.43000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.43035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.44669: stdout chunk (state=3): >>>/root <<< 18699 1726882367.44776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.44860: stderr chunk (state=3): >>><<< 18699 1726882367.44864: stdout chunk (state=3): >>><<< 18699 1726882367.44867: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.44870: _low_level_execute_command(): starting 18699 1726882367.44873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279 `" && echo ansible-tmp-1726882367.4481828-20657-189351619301279="` echo /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279 `" ) && sleep 0' 18699 1726882367.45287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882367.45301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18699 1726882367.45306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882367.45313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.45359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882367.45364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.45431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.47599: stdout chunk (state=3): >>>ansible-tmp-1726882367.4481828-20657-189351619301279=/root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279 <<< 18699 1726882367.47602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.47605: stdout chunk (state=3): >>><<< 18699 1726882367.47607: stderr chunk (state=3): >>><<< 18699 1726882367.47610: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882367.4481828-20657-189351619301279=/root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.47622: variable 'ansible_module_compression' from source: unknown 18699 1726882367.47685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882367.47767: variable 'ansible_facts' from source: unknown 18699 1726882367.47842: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py 18699 1726882367.47939: Sending initial data 18699 1726882367.47943: Sent initial data (156 bytes) 18699 1726882367.48422: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.48425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882367.48428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 
1726882367.48430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.48432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882367.48437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.48498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.48557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.50065: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 18699 1726882367.50078: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882367.50109: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882367.50157: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpkyzcpex3 /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py <<< 18699 1726882367.50159: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py" <<< 18699 1726882367.50208: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpkyzcpex3" to remote "/root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py" <<< 18699 1726882367.50210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py" <<< 18699 1726882367.50741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.50781: stderr chunk (state=3): >>><<< 18699 1726882367.50784: stdout chunk (state=3): >>><<< 18699 1726882367.50808: done transferring module to remote 18699 1726882367.50816: _low_level_execute_command(): starting 18699 1726882367.50821: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/ /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py && sleep 0' 18699 1726882367.51286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882367.51302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 
1726882367.51318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.51439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882367.51442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.51452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.53177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.53203: stderr chunk (state=3): >>><<< 18699 1726882367.53206: stdout chunk (state=3): >>><<< 18699 1726882367.53220: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.53223: _low_level_execute_command(): starting 18699 1726882367.53228: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/AnsiballZ_command.py && sleep 0' 18699 1726882367.53655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.53658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.53661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 18699 1726882367.53663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882367.53665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.53715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882367.53722: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.53764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.70355: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:32:47.686618", "end": "2024-09-20 21:32:47.702374", "delta": "0:00:00.015756", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882367.71799: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 18699 1726882367.71831: stderr chunk (state=3): >>><<< 18699 1726882367.71834: stdout chunk (state=3): >>><<< 18699 1726882367.71854: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 21:32:47.686618", "end": "2024-09-20 21:32:47.702374", "delta": "0:00:00.015756", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
18699 1726882367.71883: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882367.71891: _low_level_execute_command(): starting 18699 1726882367.71898: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882367.4481828-20657-189351619301279/ > /dev/null 2>&1 && sleep 0' 18699 1726882367.72364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882367.72367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 18699 1726882367.72369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.72375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 
1726882367.72377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882367.72427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882367.72431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882367.72477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882367.74267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882367.74290: stderr chunk (state=3): >>><<< 18699 1726882367.74294: stdout chunk (state=3): >>><<< 18699 1726882367.74309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882367.74319: handler run complete 18699 1726882367.74332: Evaluated conditional (False): False 18699 1726882367.74341: attempt loop complete, returning result 18699 1726882367.74345: _execute() done 18699 1726882367.74347: dumping result to json 18699 1726882367.74349: done dumping result, returning 18699 1726882367.74360: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-1ce6-d207-000000000505] 18699 1726882367.74362: sending task result for task 12673a56-9f93-1ce6-d207-000000000505 18699 1726882367.74455: done sending task result for task 12673a56-9f93-1ce6-d207-000000000505 18699 1726882367.74459: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.015756", "end": "2024-09-20 21:32:47.702374", "rc": 1, "start": "2024-09-20 21:32:47.686618" } MSG: non-zero return code ...ignoring 18699 1726882367.74528: no more pending results, returning what we have 18699 1726882367.74531: results queue empty 18699 1726882367.74532: checking for any_errors_fatal 18699 1726882367.74540: done checking for any_errors_fatal 18699 1726882367.74540: checking for max_fail_percentage 18699 1726882367.74542: done checking for max_fail_percentage 18699 1726882367.74542: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.74543: done checking to see if all hosts have failed 18699 1726882367.74544: getting the remaining hosts for this loop 18699 1726882367.74545: done getting the remaining hosts for this loop 18699 1726882367.74549: getting the next task for host managed_node1 18699 1726882367.74555: done getting next task for host managed_node1 18699 1726882367.74558: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on 
the nmcli output 18699 1726882367.74561: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882367.74565: getting variables 18699 1726882367.74567: in VariableManager get_vars() 18699 1726882367.74604: Calling all_inventory to load vars for managed_node1 18699 1726882367.74607: Calling groups_inventory to load vars for managed_node1 18699 1726882367.74611: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.74622: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.74625: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.74627: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.75483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.76397: done with get_vars() 18699 1726882367.76413: done getting variables 18699 1726882367.76460: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:32:47 -0400 (0:00:00.361) 0:00:41.360 ****** 18699 1726882367.76483: entering _queue_task() for managed_node1/set_fact 18699 1726882367.76720: worker is 1 (out of 1 available) 18699 1726882367.76734: exiting _queue_task() for managed_node1/set_fact 18699 1726882367.76744: done queuing things up, now waiting for results queue to drain 18699 1726882367.76745: waiting for pending results... 18699 1726882367.76923: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18699 1726882367.77013: in run() - task 12673a56-9f93-1ce6-d207-000000000506 18699 1726882367.77024: variable 'ansible_search_path' from source: unknown 18699 1726882367.77028: variable 'ansible_search_path' from source: unknown 18699 1726882367.77055: calling self._execute() 18699 1726882367.77130: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.77134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.77141: variable 'omit' from source: magic vars 18699 1726882367.77432: variable 'ansible_distribution_major_version' from source: facts 18699 1726882367.77441: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882367.77532: variable 'nm_profile_exists' from source: set_fact 18699 1726882367.77544: Evaluated conditional (nm_profile_exists.rc == 0): False 18699 1726882367.77547: when evaluation is False, skipping this task 18699 1726882367.77550: _execute() done 18699 1726882367.77553: dumping result to 
json 18699 1726882367.77556: done dumping result, returning 18699 1726882367.77563: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-1ce6-d207-000000000506] 18699 1726882367.77567: sending task result for task 12673a56-9f93-1ce6-d207-000000000506 18699 1726882367.77667: done sending task result for task 12673a56-9f93-1ce6-d207-000000000506 18699 1726882367.77670: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 18699 1726882367.77715: no more pending results, returning what we have 18699 1726882367.77718: results queue empty 18699 1726882367.77719: checking for any_errors_fatal 18699 1726882367.77727: done checking for any_errors_fatal 18699 1726882367.77728: checking for max_fail_percentage 18699 1726882367.77730: done checking for max_fail_percentage 18699 1726882367.77731: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.77732: done checking to see if all hosts have failed 18699 1726882367.77732: getting the remaining hosts for this loop 18699 1726882367.77734: done getting the remaining hosts for this loop 18699 1726882367.77737: getting the next task for host managed_node1 18699 1726882367.77746: done getting next task for host managed_node1 18699 1726882367.77749: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 18699 1726882367.77752: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882367.77755: getting variables 18699 1726882367.77756: in VariableManager get_vars() 18699 1726882367.77783: Calling all_inventory to load vars for managed_node1 18699 1726882367.77786: Calling groups_inventory to load vars for managed_node1 18699 1726882367.77789: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.77810: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.77813: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.77816: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.78739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.79633: done with get_vars() 18699 1726882367.79653: done getting variables 18699 1726882367.79701: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882367.79817: variable 'profile' from source: include params 18699 1726882367.79821: variable 'interface' from source: set_fact 18699 1726882367.79879: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:32:47 -0400 (0:00:00.034) 0:00:41.395 ****** 18699 1726882367.79917: entering _queue_task() for managed_node1/command 18699 1726882367.80210: worker is 1 (out of 1 available) 18699 1726882367.80224: exiting _queue_task() for managed_node1/command 18699 1726882367.80236: done queuing things up, now waiting for results queue to drain 18699 1726882367.80237: waiting for pending results... 18699 1726882367.80713: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 18699 1726882367.80718: in run() - task 12673a56-9f93-1ce6-d207-000000000508 18699 1726882367.80723: variable 'ansible_search_path' from source: unknown 18699 1726882367.80728: variable 'ansible_search_path' from source: unknown 18699 1726882367.80731: calling self._execute() 18699 1726882367.80825: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.80829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.80849: variable 'omit' from source: magic vars 18699 1726882367.81290: variable 'ansible_distribution_major_version' from source: facts 18699 1726882367.81295: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882367.81354: variable 'profile_stat' from source: set_fact 18699 1726882367.81367: Evaluated conditional (profile_stat.stat.exists): False 18699 1726882367.81370: when evaluation is False, skipping this task 18699 1726882367.81373: _execute() done 18699 1726882367.81375: dumping result to json 18699 1726882367.81378: done dumping result, returning 18699 1726882367.81385: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr27 [12673a56-9f93-1ce6-d207-000000000508] 18699 1726882367.81400: sending task result for task 12673a56-9f93-1ce6-d207-000000000508 18699 
1726882367.81478: done sending task result for task 12673a56-9f93-1ce6-d207-000000000508 18699 1726882367.81481: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18699 1726882367.81551: no more pending results, returning what we have 18699 1726882367.81555: results queue empty 18699 1726882367.81556: checking for any_errors_fatal 18699 1726882367.81561: done checking for any_errors_fatal 18699 1726882367.81562: checking for max_fail_percentage 18699 1726882367.81564: done checking for max_fail_percentage 18699 1726882367.81564: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.81565: done checking to see if all hosts have failed 18699 1726882367.81566: getting the remaining hosts for this loop 18699 1726882367.81567: done getting the remaining hosts for this loop 18699 1726882367.81571: getting the next task for host managed_node1 18699 1726882367.81578: done getting next task for host managed_node1 18699 1726882367.81582: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 18699 1726882367.81586: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18699 1726882367.81590: getting variables 18699 1726882367.81595: in VariableManager get_vars() 18699 1726882367.81625: Calling all_inventory to load vars for managed_node1 18699 1726882367.81628: Calling groups_inventory to load vars for managed_node1 18699 1726882367.81631: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.81644: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.81647: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.81649: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.83218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.85864: done with get_vars() 18699 1726882367.85896: done getting variables 18699 1726882367.86003: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882367.86153: variable 'profile' from source: include params 18699 1726882367.86157: variable 'interface' from source: set_fact 18699 1726882367.86221: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:32:47 -0400 (0:00:00.063) 0:00:41.458 ****** 18699 1726882367.86255: entering _queue_task() for managed_node1/set_fact 18699 1726882367.86616: worker is 1 (out of 1 available) 18699 1726882367.86629: exiting _queue_task() for managed_node1/set_fact 18699 1726882367.86641: done queuing things up, now waiting for results queue to 
drain 18699 1726882367.86642: waiting for pending results... 18699 1726882367.87011: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 18699 1726882367.87081: in run() - task 12673a56-9f93-1ce6-d207-000000000509 18699 1726882367.87085: variable 'ansible_search_path' from source: unknown 18699 1726882367.87088: variable 'ansible_search_path' from source: unknown 18699 1726882367.87171: calling self._execute() 18699 1726882367.87227: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.87231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.87289: variable 'omit' from source: magic vars 18699 1726882367.87844: variable 'ansible_distribution_major_version' from source: facts 18699 1726882367.87848: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882367.87877: variable 'profile_stat' from source: set_fact 18699 1726882367.87920: Evaluated conditional (profile_stat.stat.exists): False 18699 1726882367.87923: when evaluation is False, skipping this task 18699 1726882367.87925: _execute() done 18699 1726882367.87928: dumping result to json 18699 1726882367.87930: done dumping result, returning 18699 1726882367.87933: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [12673a56-9f93-1ce6-d207-000000000509] 18699 1726882367.87935: sending task result for task 12673a56-9f93-1ce6-d207-000000000509 18699 1726882367.88197: done sending task result for task 12673a56-9f93-1ce6-d207-000000000509 18699 1726882367.88200: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18699 1726882367.88339: no more pending results, returning what we have 18699 1726882367.88343: results queue empty 18699 1726882367.88344: checking for any_errors_fatal 18699 1726882367.88351: 
done checking for any_errors_fatal 18699 1726882367.88352: checking for max_fail_percentage 18699 1726882367.88354: done checking for max_fail_percentage 18699 1726882367.88355: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.88355: done checking to see if all hosts have failed 18699 1726882367.88356: getting the remaining hosts for this loop 18699 1726882367.88358: done getting the remaining hosts for this loop 18699 1726882367.88361: getting the next task for host managed_node1 18699 1726882367.88367: done getting next task for host managed_node1 18699 1726882367.88370: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 18699 1726882367.88374: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882367.88378: getting variables 18699 1726882367.88380: in VariableManager get_vars() 18699 1726882367.88415: Calling all_inventory to load vars for managed_node1 18699 1726882367.88418: Calling groups_inventory to load vars for managed_node1 18699 1726882367.88423: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.88435: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.88439: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.88442: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.90380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.92591: done with get_vars() 18699 1726882367.92609: done getting variables 18699 1726882367.92651: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882367.92734: variable 'profile' from source: include params 18699 1726882367.92737: variable 'interface' from source: set_fact 18699 1726882367.92775: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:32:47 -0400 (0:00:00.065) 0:00:41.524 ****** 18699 1726882367.92801: entering _queue_task() for managed_node1/command 18699 1726882367.93030: worker is 1 (out of 1 available) 18699 1726882367.93043: exiting _queue_task() for managed_node1/command 18699 1726882367.93052: done queuing things up, now waiting for results queue to drain 18699 1726882367.93053: waiting for pending results... 
18699 1726882367.93232: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 18699 1726882367.93317: in run() - task 12673a56-9f93-1ce6-d207-00000000050a 18699 1726882367.93329: variable 'ansible_search_path' from source: unknown 18699 1726882367.93333: variable 'ansible_search_path' from source: unknown 18699 1726882367.93360: calling self._execute() 18699 1726882367.93437: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882367.93441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882367.93449: variable 'omit' from source: magic vars 18699 1726882367.93723: variable 'ansible_distribution_major_version' from source: facts 18699 1726882367.93733: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882367.93813: variable 'profile_stat' from source: set_fact 18699 1726882367.93827: Evaluated conditional (profile_stat.stat.exists): False 18699 1726882367.93831: when evaluation is False, skipping this task 18699 1726882367.93836: _execute() done 18699 1726882367.93838: dumping result to json 18699 1726882367.93841: done dumping result, returning 18699 1726882367.93844: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr27 [12673a56-9f93-1ce6-d207-00000000050a] 18699 1726882367.93872: sending task result for task 12673a56-9f93-1ce6-d207-00000000050a 18699 1726882367.93943: done sending task result for task 12673a56-9f93-1ce6-d207-00000000050a 18699 1726882367.93946: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18699 1726882367.94003: no more pending results, returning what we have 18699 1726882367.94007: results queue empty 18699 1726882367.94008: checking for any_errors_fatal 18699 1726882367.94015: done checking for any_errors_fatal 18699 1726882367.94015: checking for 
max_fail_percentage 18699 1726882367.94017: done checking for max_fail_percentage 18699 1726882367.94018: checking to see if all hosts have failed and the running result is not ok 18699 1726882367.94018: done checking to see if all hosts have failed 18699 1726882367.94020: getting the remaining hosts for this loop 18699 1726882367.94022: done getting the remaining hosts for this loop 18699 1726882367.94025: getting the next task for host managed_node1 18699 1726882367.94031: done getting next task for host managed_node1 18699 1726882367.94033: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 18699 1726882367.94037: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882367.94040: getting variables 18699 1726882367.94041: in VariableManager get_vars() 18699 1726882367.94068: Calling all_inventory to load vars for managed_node1 18699 1726882367.94070: Calling groups_inventory to load vars for managed_node1 18699 1726882367.94073: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882367.94092: Calling all_plugins_play to load vars for managed_node1 18699 1726882367.94097: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882367.94101: Calling groups_plugins_play to load vars for managed_node1 18699 1726882367.96117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882367.98332: done with get_vars() 18699 1726882367.98359: done getting variables 18699 1726882367.98590: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882367.98704: variable 'profile' from source: include params 18699 1726882367.98708: variable 'interface' from source: set_fact 18699 1726882367.98767: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:32:47 -0400 (0:00:00.059) 0:00:41.586 ****** 18699 1726882367.99023: entering _queue_task() for managed_node1/set_fact 18699 1726882367.99568: worker is 1 (out of 1 available) 18699 1726882367.99582: exiting _queue_task() for managed_node1/set_fact 18699 1726882367.99826: done queuing things up, now waiting for results queue to drain 18699 1726882367.99827: waiting for pending results... 
18699 1726882368.00147: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 18699 1726882368.00400: in run() - task 12673a56-9f93-1ce6-d207-00000000050b 18699 1726882368.00404: variable 'ansible_search_path' from source: unknown 18699 1726882368.00407: variable 'ansible_search_path' from source: unknown 18699 1726882368.00410: calling self._execute() 18699 1726882368.00708: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.00742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.00744: variable 'omit' from source: magic vars 18699 1726882368.01243: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.01298: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.01387: variable 'profile_stat' from source: set_fact 18699 1726882368.01413: Evaluated conditional (profile_stat.stat.exists): False 18699 1726882368.01421: when evaluation is False, skipping this task 18699 1726882368.01434: _execute() done 18699 1726882368.01443: dumping result to json 18699 1726882368.01450: done dumping result, returning 18699 1726882368.01498: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr27 [12673a56-9f93-1ce6-d207-00000000050b] 18699 1726882368.01501: sending task result for task 12673a56-9f93-1ce6-d207-00000000050b 18699 1726882368.01737: done sending task result for task 12673a56-9f93-1ce6-d207-00000000050b 18699 1726882368.01740: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18699 1726882368.01785: no more pending results, returning what we have 18699 1726882368.01788: results queue empty 18699 1726882368.01789: checking for any_errors_fatal 18699 1726882368.01799: done checking for any_errors_fatal 18699 1726882368.01800: checking for 
max_fail_percentage 18699 1726882368.01801: done checking for max_fail_percentage 18699 1726882368.01802: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.01803: done checking to see if all hosts have failed 18699 1726882368.01804: getting the remaining hosts for this loop 18699 1726882368.01805: done getting the remaining hosts for this loop 18699 1726882368.01809: getting the next task for host managed_node1 18699 1726882368.01817: done getting next task for host managed_node1 18699 1726882368.01821: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 18699 1726882368.01824: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.01829: getting variables 18699 1726882368.01830: in VariableManager get_vars() 18699 1726882368.01861: Calling all_inventory to load vars for managed_node1 18699 1726882368.01863: Calling groups_inventory to load vars for managed_node1 18699 1726882368.01867: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.01878: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.01881: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.01883: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.03611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.05262: done with get_vars() 18699 1726882368.05286: done getting variables 18699 1726882368.05348: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882368.05462: variable 'profile' from source: include params 18699 1726882368.05466: variable 'interface' from source: set_fact 18699 1726882368.05525: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:32:48 -0400 (0:00:00.065) 0:00:41.651 ****** 18699 1726882368.05556: entering _queue_task() for managed_node1/assert 18699 1726882368.06122: worker is 1 (out of 1 available) 18699 1726882368.06131: exiting _queue_task() for managed_node1/assert 18699 1726882368.06140: done queuing things up, now waiting for results queue to drain 18699 1726882368.06141: waiting for pending results... 
18699 1726882368.06199: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' 18699 1726882368.06309: in run() - task 12673a56-9f93-1ce6-d207-0000000004f6 18699 1726882368.06329: variable 'ansible_search_path' from source: unknown 18699 1726882368.06335: variable 'ansible_search_path' from source: unknown 18699 1726882368.06376: calling self._execute() 18699 1726882368.06502: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.06514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.06535: variable 'omit' from source: magic vars 18699 1726882368.07127: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.07146: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.07157: variable 'omit' from source: magic vars 18699 1726882368.07204: variable 'omit' from source: magic vars 18699 1726882368.07311: variable 'profile' from source: include params 18699 1726882368.07323: variable 'interface' from source: set_fact 18699 1726882368.07453: variable 'interface' from source: set_fact 18699 1726882368.07456: variable 'omit' from source: magic vars 18699 1726882368.07472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882368.07516: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882368.07544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882368.07571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.07587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.07626: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 18699 1726882368.07636: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.07645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.07779: Set connection var ansible_connection to ssh 18699 1726882368.07783: Set connection var ansible_pipelining to False 18699 1726882368.07785: Set connection var ansible_shell_executable to /bin/sh 18699 1726882368.07787: Set connection var ansible_timeout to 10 18699 1726882368.07790: Set connection var ansible_shell_type to sh 18699 1726882368.07794: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882368.07827: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.07835: variable 'ansible_connection' from source: unknown 18699 1726882368.07888: variable 'ansible_module_compression' from source: unknown 18699 1726882368.07891: variable 'ansible_shell_type' from source: unknown 18699 1726882368.07894: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.07897: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.07899: variable 'ansible_pipelining' from source: unknown 18699 1726882368.07901: variable 'ansible_timeout' from source: unknown 18699 1726882368.07903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.08026: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882368.08043: variable 'omit' from source: magic vars 18699 1726882368.08053: starting attempt loop 18699 1726882368.08061: running the handler 18699 1726882368.08181: variable 'lsr_net_profile_exists' from source: set_fact 18699 1726882368.08214: Evaluated conditional (not 
lsr_net_profile_exists): True 18699 1726882368.08217: handler run complete 18699 1726882368.08226: attempt loop complete, returning result 18699 1726882368.08232: _execute() done 18699 1726882368.08238: dumping result to json 18699 1726882368.08323: done dumping result, returning 18699 1726882368.08326: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'lsr27' [12673a56-9f93-1ce6-d207-0000000004f6] 18699 1726882368.08329: sending task result for task 12673a56-9f93-1ce6-d207-0000000004f6 18699 1726882368.08391: done sending task result for task 12673a56-9f93-1ce6-d207-0000000004f6 18699 1726882368.08396: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18699 1726882368.08471: no more pending results, returning what we have 18699 1726882368.08474: results queue empty 18699 1726882368.08476: checking for any_errors_fatal 18699 1726882368.08482: done checking for any_errors_fatal 18699 1726882368.08482: checking for max_fail_percentage 18699 1726882368.08484: done checking for max_fail_percentage 18699 1726882368.08485: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.08486: done checking to see if all hosts have failed 18699 1726882368.08486: getting the remaining hosts for this loop 18699 1726882368.08488: done getting the remaining hosts for this loop 18699 1726882368.08492: getting the next task for host managed_node1 18699 1726882368.08503: done getting next task for host managed_node1 18699 1726882368.08507: ^ task is: TASK: Include the task 'assert_device_absent.yml' 18699 1726882368.08509: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.08513: getting variables 18699 1726882368.08515: in VariableManager get_vars() 18699 1726882368.08544: Calling all_inventory to load vars for managed_node1 18699 1726882368.08547: Calling groups_inventory to load vars for managed_node1 18699 1726882368.08550: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.08561: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.08564: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.08567: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.10292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.11856: done with get_vars() 18699 1726882368.11877: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 21:32:48 -0400 (0:00:00.064) 0:00:41.715 ****** 18699 1726882368.11966: entering _queue_task() for managed_node1/include_tasks 18699 1726882368.12264: worker is 1 (out of 1 available) 18699 1726882368.12274: exiting _queue_task() for managed_node1/include_tasks 18699 1726882368.12286: done queuing things up, now waiting for results queue to drain 18699 1726882368.12287: waiting for pending results... 
18699 1726882368.12713: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 18699 1726882368.12719: in run() - task 12673a56-9f93-1ce6-d207-000000000075 18699 1726882368.12723: variable 'ansible_search_path' from source: unknown 18699 1726882368.12726: calling self._execute() 18699 1726882368.12800: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.12813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.12830: variable 'omit' from source: magic vars 18699 1726882368.13211: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.13230: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.13241: _execute() done 18699 1726882368.13251: dumping result to json 18699 1726882368.13259: done dumping result, returning 18699 1726882368.13268: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [12673a56-9f93-1ce6-d207-000000000075] 18699 1726882368.13284: sending task result for task 12673a56-9f93-1ce6-d207-000000000075 18699 1726882368.13524: no more pending results, returning what we have 18699 1726882368.13530: in VariableManager get_vars() 18699 1726882368.13563: Calling all_inventory to load vars for managed_node1 18699 1726882368.13566: Calling groups_inventory to load vars for managed_node1 18699 1726882368.13569: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.13583: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.13586: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.13589: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.14206: done sending task result for task 12673a56-9f93-1ce6-d207-000000000075 18699 1726882368.14210: WORKER PROCESS EXITING 18699 1726882368.15035: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.16603: done with get_vars() 18699 1726882368.16624: variable 'ansible_search_path' from source: unknown 18699 1726882368.16639: we have included files to process 18699 1726882368.16640: generating all_blocks data 18699 1726882368.16643: done generating all_blocks data 18699 1726882368.16649: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18699 1726882368.16650: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18699 1726882368.16653: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18699 1726882368.16816: in VariableManager get_vars() 18699 1726882368.16832: done with get_vars() 18699 1726882368.16943: done processing included file 18699 1726882368.16945: iterating over new_blocks loaded from include file 18699 1726882368.16946: in VariableManager get_vars() 18699 1726882368.16957: done with get_vars() 18699 1726882368.16959: filtering new block on tags 18699 1726882368.16977: done filtering new block on tags 18699 1726882368.16979: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 18699 1726882368.16984: extending task lists for all hosts with included blocks 18699 1726882368.17132: done extending task lists 18699 1726882368.17133: done processing included files 18699 1726882368.17134: results queue empty 18699 1726882368.17135: checking for any_errors_fatal 18699 1726882368.17138: done checking for any_errors_fatal 18699 1726882368.17139: checking for max_fail_percentage 18699 1726882368.17140: done 
checking for max_fail_percentage 18699 1726882368.17141: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.17141: done checking to see if all hosts have failed 18699 1726882368.17142: getting the remaining hosts for this loop 18699 1726882368.17143: done getting the remaining hosts for this loop 18699 1726882368.17146: getting the next task for host managed_node1 18699 1726882368.17149: done getting next task for host managed_node1 18699 1726882368.17151: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18699 1726882368.17154: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.17156: getting variables 18699 1726882368.17157: in VariableManager get_vars() 18699 1726882368.17166: Calling all_inventory to load vars for managed_node1 18699 1726882368.17168: Calling groups_inventory to load vars for managed_node1 18699 1726882368.17170: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.17175: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.17178: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.17180: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.18465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.19991: done with get_vars() 18699 1726882368.20016: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:32:48 -0400 (0:00:00.081) 0:00:41.796 ****** 18699 1726882368.20087: entering _queue_task() for managed_node1/include_tasks 18699 1726882368.20439: worker is 1 (out of 1 available) 18699 1726882368.20452: exiting _queue_task() for managed_node1/include_tasks 18699 1726882368.20462: done queuing things up, now waiting for results queue to drain 18699 1726882368.20463: waiting for pending results... 
18699 1726882368.20734: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 18699 1726882368.20855: in run() - task 12673a56-9f93-1ce6-d207-00000000053c 18699 1726882368.20871: variable 'ansible_search_path' from source: unknown 18699 1726882368.20878: variable 'ansible_search_path' from source: unknown 18699 1726882368.20923: calling self._execute() 18699 1726882368.21020: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.21032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.21044: variable 'omit' from source: magic vars 18699 1726882368.21413: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.21430: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.21439: _execute() done 18699 1726882368.21445: dumping result to json 18699 1726882368.21451: done dumping result, returning 18699 1726882368.21463: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-1ce6-d207-00000000053c] 18699 1726882368.21471: sending task result for task 12673a56-9f93-1ce6-d207-00000000053c 18699 1726882368.21802: done sending task result for task 12673a56-9f93-1ce6-d207-00000000053c 18699 1726882368.21805: WORKER PROCESS EXITING 18699 1726882368.21827: no more pending results, returning what we have 18699 1726882368.21831: in VariableManager get_vars() 18699 1726882368.21859: Calling all_inventory to load vars for managed_node1 18699 1726882368.21862: Calling groups_inventory to load vars for managed_node1 18699 1726882368.21865: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.21876: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.21879: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.21883: Calling groups_plugins_play to load vars for managed_node1 18699 
1726882368.23317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.24863: done with get_vars() 18699 1726882368.24885: variable 'ansible_search_path' from source: unknown 18699 1726882368.24886: variable 'ansible_search_path' from source: unknown 18699 1726882368.24925: we have included files to process 18699 1726882368.24927: generating all_blocks data 18699 1726882368.24928: done generating all_blocks data 18699 1726882368.24929: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18699 1726882368.24931: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18699 1726882368.24933: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18699 1726882368.25120: done processing included file 18699 1726882368.25122: iterating over new_blocks loaded from include file 18699 1726882368.25123: in VariableManager get_vars() 18699 1726882368.25136: done with get_vars() 18699 1726882368.25138: filtering new block on tags 18699 1726882368.25152: done filtering new block on tags 18699 1726882368.25154: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 18699 1726882368.25159: extending task lists for all hosts with included blocks 18699 1726882368.25258: done extending task lists 18699 1726882368.25259: done processing included files 18699 1726882368.25260: results queue empty 18699 1726882368.25261: checking for any_errors_fatal 18699 1726882368.25263: done checking for any_errors_fatal 18699 1726882368.25264: checking for max_fail_percentage 18699 1726882368.25265: done checking for 
max_fail_percentage 18699 1726882368.25265: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.25266: done checking to see if all hosts have failed 18699 1726882368.25267: getting the remaining hosts for this loop 18699 1726882368.25268: done getting the remaining hosts for this loop 18699 1726882368.25270: getting the next task for host managed_node1 18699 1726882368.25274: done getting next task for host managed_node1 18699 1726882368.25277: ^ task is: TASK: Get stat for interface {{ interface }} 18699 1726882368.25279: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.25282: getting variables 18699 1726882368.25283: in VariableManager get_vars() 18699 1726882368.25291: Calling all_inventory to load vars for managed_node1 18699 1726882368.25295: Calling groups_inventory to load vars for managed_node1 18699 1726882368.25297: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.25302: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.25305: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.25308: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.26458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.27999: done with get_vars() 18699 1726882368.28022: done getting variables 18699 1726882368.28176: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:32:48 -0400 (0:00:00.081) 0:00:41.878 ****** 18699 1726882368.28209: entering _queue_task() for managed_node1/stat 18699 1726882368.28566: worker is 1 (out of 1 available) 18699 1726882368.28579: exiting _queue_task() for managed_node1/stat 18699 1726882368.28590: done queuing things up, now waiting for results queue to drain 18699 1726882368.28591: waiting for pending results... 
18699 1726882368.28877: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 18699 1726882368.28998: in run() - task 12673a56-9f93-1ce6-d207-000000000554 18699 1726882368.29023: variable 'ansible_search_path' from source: unknown 18699 1726882368.29030: variable 'ansible_search_path' from source: unknown 18699 1726882368.29070: calling self._execute() 18699 1726882368.29172: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.29184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.29202: variable 'omit' from source: magic vars 18699 1726882368.29568: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.29587: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.29602: variable 'omit' from source: magic vars 18699 1726882368.29656: variable 'omit' from source: magic vars 18699 1726882368.29760: variable 'interface' from source: set_fact 18699 1726882368.29782: variable 'omit' from source: magic vars 18699 1726882368.29833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882368.29874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882368.29905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882368.29929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.29946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.29981: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882368.29990: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.30001: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.30106: Set connection var ansible_connection to ssh 18699 1726882368.30125: Set connection var ansible_pipelining to False 18699 1726882368.30198: Set connection var ansible_shell_executable to /bin/sh 18699 1726882368.30201: Set connection var ansible_timeout to 10 18699 1726882368.30203: Set connection var ansible_shell_type to sh 18699 1726882368.30205: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882368.30207: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.30209: variable 'ansible_connection' from source: unknown 18699 1726882368.30211: variable 'ansible_module_compression' from source: unknown 18699 1726882368.30213: variable 'ansible_shell_type' from source: unknown 18699 1726882368.30215: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.30216: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.30219: variable 'ansible_pipelining' from source: unknown 18699 1726882368.30223: variable 'ansible_timeout' from source: unknown 18699 1726882368.30235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.30438: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18699 1726882368.30460: variable 'omit' from source: magic vars 18699 1726882368.30471: starting attempt loop 18699 1726882368.30477: running the handler 18699 1726882368.30496: _low_level_execute_command(): starting 18699 1726882368.30557: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882368.31314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.31358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.31385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.31456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.33158: stdout chunk (state=3): >>>/root <<< 18699 1726882368.33326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.33330: stdout chunk (state=3): >>><<< 18699 1726882368.33332: stderr chunk (state=3): >>><<< 18699 1726882368.33445: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.33450: _low_level_execute_command(): starting 18699 1726882368.33453: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540 `" && echo ansible-tmp-1726882368.3335829-20693-81867922137540="` echo /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540 `" ) && sleep 0' 18699 1726882368.34009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882368.34115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.34140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882368.34155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.34174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.34247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.36104: stdout chunk (state=3): >>>ansible-tmp-1726882368.3335829-20693-81867922137540=/root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540 <<< 18699 1726882368.36246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.36256: stdout chunk (state=3): >>><<< 18699 1726882368.36267: stderr chunk (state=3): >>><<< 18699 1726882368.36403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882368.3335829-20693-81867922137540=/root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.36406: variable 'ansible_module_compression' from source: unknown 18699 1726882368.36409: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18699 1726882368.36442: variable 'ansible_facts' from source: unknown 18699 1726882368.36538: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py 18699 1726882368.36749: Sending initial data 18699 1726882368.36752: Sent initial data (152 bytes) 18699 1726882368.37379: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 
1726882368.37429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.37463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.38966: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882368.39022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882368.39066: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp1j1lslrm /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py <<< 18699 1726882368.39069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py" <<< 18699 1726882368.39133: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp1j1lslrm" to remote "/root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py" <<< 18699 1726882368.39733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.39769: stderr chunk (state=3): >>><<< 18699 1726882368.39773: stdout chunk (state=3): >>><<< 18699 1726882368.39814: done transferring module to remote 18699 1726882368.39822: _low_level_execute_command(): starting 18699 1726882368.39826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/ /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py && sleep 0' 18699 1726882368.40331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882368.40335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882368.40337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.40384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.40387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.40436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.42140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.42229: stderr chunk (state=3): >>><<< 18699 1726882368.42235: stdout chunk (state=3): >>><<< 18699 1726882368.42238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.42245: _low_level_execute_command(): starting 18699 1726882368.42248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/AnsiballZ_stat.py && sleep 0' 18699 1726882368.42768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882368.42783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882368.42786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882368.42789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.42791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882368.42811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.42866: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18699 1726882368.42920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.57847: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18699 1726882368.59019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882368.59044: stderr chunk (state=3): >>><<< 18699 1726882368.59048: stdout chunk (state=3): >>><<< 18699 1726882368.59065: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882368.59086: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882368.59098: _low_level_execute_command(): starting 18699 1726882368.59117: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882368.3335829-20693-81867922137540/ > /dev/null 2>&1 && sleep 0' 18699 1726882368.59589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882368.59594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.59597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 18699 1726882368.59599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 18699 1726882368.59601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.59652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882368.59659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.59662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.59704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.61536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.61539: stdout chunk (state=3): >>><<< 18699 1726882368.61541: stderr chunk (state=3): >>><<< 18699 1726882368.61558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.61561: handler run complete 18699 1726882368.61589: attempt loop complete, returning result 18699 1726882368.61601: _execute() done 18699 1726882368.61604: dumping result to json 18699 1726882368.61606: done dumping result, returning 18699 1726882368.61608: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr27 [12673a56-9f93-1ce6-d207-000000000554] 18699 1726882368.61610: sending task result for task 12673a56-9f93-1ce6-d207-000000000554 18699 1726882368.61748: done sending task result for task 12673a56-9f93-1ce6-d207-000000000554 18699 1726882368.61751: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 18699 1726882368.61814: no more pending results, returning what we have 18699 1726882368.61818: results queue empty 18699 1726882368.61819: checking for any_errors_fatal 18699 1726882368.61821: done checking for any_errors_fatal 18699 1726882368.61821: checking for max_fail_percentage 18699 1726882368.61823: done checking for max_fail_percentage 18699 1726882368.61824: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.61825: done checking to see if all hosts have failed 18699 1726882368.61825: getting the remaining hosts for this loop 18699 1726882368.61827: done getting the remaining hosts for this loop 18699 1726882368.61831: getting the next task for host managed_node1 18699 1726882368.61838: done getting next task for host managed_node1 18699 1726882368.61841: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 18699 1726882368.61843: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882368.61848: getting variables 18699 1726882368.61849: in VariableManager get_vars() 18699 1726882368.61879: Calling all_inventory to load vars for managed_node1 18699 1726882368.61881: Calling groups_inventory to load vars for managed_node1 18699 1726882368.61884: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.61933: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.61937: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.61941: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.63217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.64399: done with get_vars() 18699 1726882368.64427: done getting variables 18699 1726882368.64505: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18699 1726882368.64631: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 
2024 21:32:48 -0400 (0:00:00.364) 0:00:42.242 ****** 18699 1726882368.64661: entering _queue_task() for managed_node1/assert 18699 1726882368.65034: worker is 1 (out of 1 available) 18699 1726882368.65048: exiting _queue_task() for managed_node1/assert 18699 1726882368.65058: done queuing things up, now waiting for results queue to drain 18699 1726882368.65059: waiting for pending results... 18699 1726882368.65555: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' 18699 1726882368.65561: in run() - task 12673a56-9f93-1ce6-d207-00000000053d 18699 1726882368.65564: variable 'ansible_search_path' from source: unknown 18699 1726882368.65566: variable 'ansible_search_path' from source: unknown 18699 1726882368.65569: calling self._execute() 18699 1726882368.65760: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.65764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.65767: variable 'omit' from source: magic vars 18699 1726882368.66150: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.66186: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.66194: variable 'omit' from source: magic vars 18699 1726882368.66229: variable 'omit' from source: magic vars 18699 1726882368.66332: variable 'interface' from source: set_fact 18699 1726882368.66349: variable 'omit' from source: magic vars 18699 1726882368.66397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882368.66469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882368.66474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882368.66599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 18699 1726882368.66652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.66713: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882368.66725: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.66775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.67045: Set connection var ansible_connection to ssh 18699 1726882368.67091: Set connection var ansible_pipelining to False 18699 1726882368.67097: Set connection var ansible_shell_executable to /bin/sh 18699 1726882368.67100: Set connection var ansible_timeout to 10 18699 1726882368.67103: Set connection var ansible_shell_type to sh 18699 1726882368.67105: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882368.67128: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.67137: variable 'ansible_connection' from source: unknown 18699 1726882368.67153: variable 'ansible_module_compression' from source: unknown 18699 1726882368.67155: variable 'ansible_shell_type' from source: unknown 18699 1726882368.67201: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.67204: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.67206: variable 'ansible_pipelining' from source: unknown 18699 1726882368.67208: variable 'ansible_timeout' from source: unknown 18699 1726882368.67210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.67357: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882368.67367: 
variable 'omit' from source: magic vars 18699 1726882368.67372: starting attempt loop 18699 1726882368.67375: running the handler 18699 1726882368.67509: variable 'interface_stat' from source: set_fact 18699 1726882368.67517: Evaluated conditional (not interface_stat.stat.exists): True 18699 1726882368.67522: handler run complete 18699 1726882368.67534: attempt loop complete, returning result 18699 1726882368.67537: _execute() done 18699 1726882368.67541: dumping result to json 18699 1726882368.67544: done dumping result, returning 18699 1726882368.67548: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'lsr27' [12673a56-9f93-1ce6-d207-00000000053d] 18699 1726882368.67553: sending task result for task 12673a56-9f93-1ce6-d207-00000000053d 18699 1726882368.67638: done sending task result for task 12673a56-9f93-1ce6-d207-00000000053d 18699 1726882368.67641: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 18699 1726882368.67699: no more pending results, returning what we have 18699 1726882368.67702: results queue empty 18699 1726882368.67703: checking for any_errors_fatal 18699 1726882368.67710: done checking for any_errors_fatal 18699 1726882368.67710: checking for max_fail_percentage 18699 1726882368.67712: done checking for max_fail_percentage 18699 1726882368.67713: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.67714: done checking to see if all hosts have failed 18699 1726882368.67714: getting the remaining hosts for this loop 18699 1726882368.67716: done getting the remaining hosts for this loop 18699 1726882368.67719: getting the next task for host managed_node1 18699 1726882368.67728: done getting next task for host managed_node1 18699 1726882368.67730: ^ task is: TASK: meta (flush_handlers) 18699 1726882368.67731: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882368.67736: getting variables 18699 1726882368.67737: in VariableManager get_vars() 18699 1726882368.67766: Calling all_inventory to load vars for managed_node1 18699 1726882368.67769: Calling groups_inventory to load vars for managed_node1 18699 1726882368.67772: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.67782: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.67785: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.67787: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.69919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.71952: done with get_vars() 18699 1726882368.71980: done getting variables 18699 1726882368.72055: in VariableManager get_vars() 18699 1726882368.72065: Calling all_inventory to load vars for managed_node1 18699 1726882368.72067: Calling groups_inventory to load vars for managed_node1 18699 1726882368.72070: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.72074: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.72076: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.72079: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.73460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.75604: done with get_vars() 18699 1726882368.75637: done queuing things up, now waiting for results queue to drain 18699 1726882368.75642: results queue empty 18699 1726882368.75643: checking for any_errors_fatal 18699 1726882368.75645: done checking for any_errors_fatal 18699 1726882368.75646: 
checking for max_fail_percentage 18699 1726882368.75647: done checking for max_fail_percentage 18699 1726882368.75648: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.75648: done checking to see if all hosts have failed 18699 1726882368.75657: getting the remaining hosts for this loop 18699 1726882368.75658: done getting the remaining hosts for this loop 18699 1726882368.75663: getting the next task for host managed_node1 18699 1726882368.75667: done getting next task for host managed_node1 18699 1726882368.75668: ^ task is: TASK: meta (flush_handlers) 18699 1726882368.75670: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882368.75673: getting variables 18699 1726882368.75674: in VariableManager get_vars() 18699 1726882368.75683: Calling all_inventory to load vars for managed_node1 18699 1726882368.75685: Calling groups_inventory to load vars for managed_node1 18699 1726882368.75688: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.75705: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.75710: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.75716: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.77224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.79111: done with get_vars() 18699 1726882368.79131: done getting variables 18699 1726882368.79188: in VariableManager get_vars() 18699 1726882368.79201: Calling all_inventory to load vars for managed_node1 18699 1726882368.79204: Calling groups_inventory to load vars for managed_node1 18699 1726882368.79206: Calling all_plugins_inventory 
to load vars for managed_node1 18699 1726882368.79211: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.79213: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.79216: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.80389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.81934: done with get_vars() 18699 1726882368.81955: done queuing things up, now waiting for results queue to drain 18699 1726882368.81956: results queue empty 18699 1726882368.81957: checking for any_errors_fatal 18699 1726882368.81958: done checking for any_errors_fatal 18699 1726882368.81958: checking for max_fail_percentage 18699 1726882368.81959: done checking for max_fail_percentage 18699 1726882368.81959: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.81960: done checking to see if all hosts have failed 18699 1726882368.81960: getting the remaining hosts for this loop 18699 1726882368.81961: done getting the remaining hosts for this loop 18699 1726882368.81963: getting the next task for host managed_node1 18699 1726882368.81965: done getting next task for host managed_node1 18699 1726882368.81966: ^ task is: None 18699 1726882368.81967: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.81968: done queuing things up, now waiting for results queue to drain 18699 1726882368.81968: results queue empty 18699 1726882368.81969: checking for any_errors_fatal 18699 1726882368.81969: done checking for any_errors_fatal 18699 1726882368.81970: checking for max_fail_percentage 18699 1726882368.81970: done checking for max_fail_percentage 18699 1726882368.81971: checking to see if all hosts have failed and the running result is not ok 18699 1726882368.81971: done checking to see if all hosts have failed 18699 1726882368.81972: getting the next task for host managed_node1 18699 1726882368.81973: done getting next task for host managed_node1 18699 1726882368.81973: ^ task is: None 18699 1726882368.81974: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.82014: in VariableManager get_vars() 18699 1726882368.82026: done with get_vars() 18699 1726882368.82030: in VariableManager get_vars() 18699 1726882368.82036: done with get_vars() 18699 1726882368.82038: variable 'omit' from source: magic vars 18699 1726882368.82060: in VariableManager get_vars() 18699 1726882368.82065: done with get_vars() 18699 1726882368.82079: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18699 1726882368.82238: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18699 1726882368.82258: getting the remaining hosts for this loop 18699 1726882368.82259: done getting the remaining hosts for this loop 18699 1726882368.82261: getting the next task for host managed_node1 18699 1726882368.82262: done getting next task for host managed_node1 18699 1726882368.82264: ^ task is: TASK: Gathering Facts 18699 1726882368.82265: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882368.82266: getting variables 18699 1726882368.82266: in VariableManager get_vars() 18699 1726882368.82272: Calling all_inventory to load vars for managed_node1 18699 1726882368.82274: Calling groups_inventory to load vars for managed_node1 18699 1726882368.82275: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882368.82279: Calling all_plugins_play to load vars for managed_node1 18699 1726882368.82281: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882368.82282: Calling groups_plugins_play to load vars for managed_node1 18699 1726882368.83018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882368.84273: done with get_vars() 18699 1726882368.84297: done getting variables 18699 1726882368.84342: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 21:32:48 -0400 (0:00:00.197) 0:00:42.439 ****** 18699 1726882368.84371: entering _queue_task() for managed_node1/gather_facts 18699 1726882368.84641: worker is 1 (out of 1 available) 18699 1726882368.84652: exiting _queue_task() for managed_node1/gather_facts 18699 1726882368.84661: done queuing things up, now waiting for results queue to drain 18699 1726882368.84662: waiting for pending results... 
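
The stat-then-assert sequence the log walked through above — transfer `AnsiballZ_stat.py`, run it to get `{"stat": {"exists": false}}` for `/sys/class/net/lsr27`, then evaluate the conditional `not interface_stat.stat.exists` — can be sketched in plain Python. This is an illustrative reconstruction of the check's logic only; `interface_stat` here is a hypothetical helper, not the real `ansible.modules.stat` code:

```python
import os

def interface_stat(path: str) -> dict:
    # Minimal result shape matching what the log's stat invocation requested:
    # get_attributes/get_checksum/get_mime were all False, so only existence
    # is reported (hypothetical helper, not Ansible's module implementation).
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

# Mirrors the task "Get stat for interface lsr27" from the log.
result = interface_stat("/sys/class/net/lsr27")

# Mirrors the follow-up task "Assert that the interface is absent - 'lsr27'",
# whose conditional was `not interface_stat.stat.exists`.
interface_is_absent = not result["stat"]["exists"]
```

On the managed node the sysfs path did not exist, so the stat result was `{"exists": false}`, the conditional evaluated to True, and the play reported "All assertions passed" before moving on to the next play shown below.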
18699 1726882368.84845: running TaskExecutor() for managed_node1/TASK: Gathering Facts 18699 1726882368.84914: in run() - task 12673a56-9f93-1ce6-d207-00000000056d 18699 1726882368.84926: variable 'ansible_search_path' from source: unknown 18699 1726882368.84954: calling self._execute() 18699 1726882368.85031: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.85035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.85042: variable 'omit' from source: magic vars 18699 1726882368.85315: variable 'ansible_distribution_major_version' from source: facts 18699 1726882368.85328: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882368.85336: variable 'omit' from source: magic vars 18699 1726882368.85354: variable 'omit' from source: magic vars 18699 1726882368.85379: variable 'omit' from source: magic vars 18699 1726882368.85414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882368.85445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882368.85460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882368.85474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.85484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882368.85512: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882368.85516: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.85519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.85587: Set connection var ansible_connection to ssh 18699 1726882368.85595: Set 
connection var ansible_pipelining to False 18699 1726882368.85603: Set connection var ansible_shell_executable to /bin/sh 18699 1726882368.85608: Set connection var ansible_timeout to 10 18699 1726882368.85610: Set connection var ansible_shell_type to sh 18699 1726882368.85615: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882368.85637: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.85640: variable 'ansible_connection' from source: unknown 18699 1726882368.85642: variable 'ansible_module_compression' from source: unknown 18699 1726882368.85647: variable 'ansible_shell_type' from source: unknown 18699 1726882368.85650: variable 'ansible_shell_executable' from source: unknown 18699 1726882368.85652: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882368.85656: variable 'ansible_pipelining' from source: unknown 18699 1726882368.85658: variable 'ansible_timeout' from source: unknown 18699 1726882368.85660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882368.85798: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882368.85808: variable 'omit' from source: magic vars 18699 1726882368.85812: starting attempt loop 18699 1726882368.85814: running the handler 18699 1726882368.85828: variable 'ansible_facts' from source: unknown 18699 1726882368.85842: _low_level_execute_command(): starting 18699 1726882368.85849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882368.86467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.86513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.86552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.88184: stdout chunk (state=3): >>>/root <<< 18699 1726882368.88310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.88314: stdout chunk (state=3): >>><<< 18699 1726882368.88321: stderr chunk (state=3): >>><<< 18699 1726882368.88342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.88353: _low_level_execute_command(): starting 18699 1726882368.88359: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786 `" && echo ansible-tmp-1726882368.8834045-20721-43466428775786="` echo /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786 `" ) && sleep 0' 18699 1726882368.88797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882368.88801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.88804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882368.88814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.88854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882368.88858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.88908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.90777: stdout chunk (state=3): >>>ansible-tmp-1726882368.8834045-20721-43466428775786=/root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786 <<< 18699 1726882368.90890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.90898: stdout chunk (state=3): >>><<< 18699 1726882368.90901: stderr chunk (state=3): >>><<< 18699 1726882368.90917: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882368.8834045-20721-43466428775786=/root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.90955: variable 'ansible_module_compression' from source: unknown 18699 1726882368.90999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18699 1726882368.91200: variable 'ansible_facts' from source: unknown 18699 1726882368.91276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py 18699 1726882368.91448: Sending initial data 18699 1726882368.91458: Sent initial data (153 bytes) 18699 1726882368.92118: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882368.92211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.92249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 18699 1726882368.92267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.92298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.92374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.93901: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18699 1726882368.93934: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882368.93999: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882368.94110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpcaxw7sv8 /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py <<< 18699 1726882368.94113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py" <<< 18699 1726882368.94205: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpcaxw7sv8" to remote "/root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py" <<< 18699 1726882368.95776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.95828: stderr chunk (state=3): >>><<< 18699 1726882368.95841: stdout chunk (state=3): >>><<< 18699 1726882368.95901: done transferring module to remote 18699 1726882368.95904: _low_level_execute_command(): starting 18699 1726882368.95906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/ /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py && sleep 0' 18699 1726882368.96465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882368.96468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.96480: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.96536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882368.96539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882368.96586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882368.98322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882368.98325: stderr chunk (state=3): >>><<< 18699 1726882368.98327: stdout chunk (state=3): >>><<< 18699 1726882368.98417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882368.98421: _low_level_execute_command(): starting 18699 1726882368.98423: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/AnsiballZ_setup.py && sleep 0' 18699 1726882368.98963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882368.98974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882368.99010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.99058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882368.99108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882368.99120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882368.99422: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18699 1726882368.99469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882369.61135: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "49", "epoch": "1726882369", "epoch_int": "1726882369", "date": "2024-09-20", "time": "21:32:49", "iso8601_micro": "2024-09-21T01:32:49.266843Z", "iso8601": "2024-09-21T01:32:49Z", "iso8601_basic": "20240920T213249266843", "iso8601_basic_short": "20240920T213249", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.43603515625, "5m": 0.3291015625, "15m": 0.166015625}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 
0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS<<< 18699 1726882369.61172: stdout chunk (state=3): >>>_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off 
[fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", 
"prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off 
[fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 802, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794832384, "block_size": 4096, "block_total": 65519099, "block_available": 63914754, "block_used": 1604345, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18699 1726882369.63125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882369.63199: stdout chunk (state=3): >>><<< 18699 1726882369.63203: stderr chunk (state=3): >>><<< 18699 1726882369.63206: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "49", "epoch": "1726882369", "epoch_int": "1726882369", "date": "2024-09-20", "time": "21:32:49", "iso8601_micro": 
"2024-09-21T01:32:49.266843Z", "iso8601": "2024-09-21T01:32:49Z", "iso8601_basic": "20240920T213249266843", "iso8601_basic_short": "20240920T213249", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.43603515625, "5m": 0.3291015625, "15m": 0.166015625}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": 
"on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off 
[fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, 
"nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 802, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268366229504, "size_available": 261794832384, "block_size": 4096, "block_total": 65519099, "block_available": 63914754, "block_used": 1604345, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882369.63560: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882369.63588: _low_level_execute_command(): starting 18699 1726882369.63600: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882368.8834045-20721-43466428775786/ > /dev/null 2>&1 && sleep 0' 18699 1726882369.64201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882369.64216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882369.64232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882369.64255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18699 1726882369.64315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882369.64377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882369.64394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882369.64428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882369.64508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882369.66351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882369.66357: stdout chunk (state=3): >>><<< 18699 1726882369.66359: stderr chunk (state=3): >>><<< 18699 1726882369.66499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882369.66503: handler run complete 18699 1726882369.66537: variable 'ansible_facts' from source: unknown 18699 1726882369.66662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.67007: variable 'ansible_facts' from source: unknown 18699 1726882369.67111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.67254: attempt loop complete, returning result 18699 1726882369.67262: _execute() done 18699 1726882369.67275: dumping result to json 18699 1726882369.67313: done dumping result, returning 18699 1726882369.67324: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-1ce6-d207-00000000056d] 18699 1726882369.67386: sending task result for task 12673a56-9f93-1ce6-d207-00000000056d ok: [managed_node1] 18699 1726882369.68213: no more pending results, returning what we have 18699 1726882369.68246: results queue empty 18699 1726882369.68248: checking for any_errors_fatal 18699 1726882369.68249: done checking for any_errors_fatal 18699 1726882369.68250: checking for max_fail_percentage 18699 1726882369.68252: done checking for max_fail_percentage 18699 1726882369.68253: checking to see if all hosts have failed and the running result is not ok 18699 1726882369.68253: done checking to see if all hosts have failed 18699 1726882369.68254: getting the remaining hosts for this loop 18699 1726882369.68255: done getting the remaining hosts for this loop 18699 1726882369.68259: getting the next task for host managed_node1 18699 
1726882369.68264: done getting next task for host managed_node1 18699 1726882369.68266: ^ task is: TASK: meta (flush_handlers) 18699 1726882369.68268: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18699 1726882369.68272: getting variables 18699 1726882369.68273: in VariableManager get_vars() 18699 1726882369.68301: Calling all_inventory to load vars for managed_node1 18699 1726882369.68304: Calling groups_inventory to load vars for managed_node1 18699 1726882369.68307: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882369.68412: Calling all_plugins_play to load vars for managed_node1 18699 1726882369.68431: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882369.68478: done sending task result for task 12673a56-9f93-1ce6-d207-00000000056d 18699 1726882369.68481: WORKER PROCESS EXITING 18699 1726882369.68486: Calling groups_plugins_play to load vars for managed_node1 18699 1726882369.70382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.76459: done with get_vars() 18699 1726882369.76482: done getting variables 18699 1726882369.76549: in VariableManager get_vars() 18699 1726882369.76558: Calling all_inventory to load vars for managed_node1 18699 1726882369.76560: Calling groups_inventory to load vars for managed_node1 18699 1726882369.76562: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882369.76567: Calling all_plugins_play to load vars for managed_node1 18699 1726882369.76569: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882369.76572: Calling groups_plugins_play to load vars for managed_node1 18699 1726882369.77726: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.79311: done with get_vars() 18699 1726882369.79342: done queuing things up, now waiting for results queue to drain 18699 1726882369.79344: results queue empty 18699 1726882369.79345: checking for any_errors_fatal 18699 1726882369.79349: done checking for any_errors_fatal 18699 1726882369.79350: checking for max_fail_percentage 18699 1726882369.79355: done checking for max_fail_percentage 18699 1726882369.79356: checking to see if all hosts have failed and the running result is not ok 18699 1726882369.79356: done checking to see if all hosts have failed 18699 1726882369.79357: getting the remaining hosts for this loop 18699 1726882369.79358: done getting the remaining hosts for this loop 18699 1726882369.79361: getting the next task for host managed_node1 18699 1726882369.79365: done getting next task for host managed_node1 18699 1726882369.79367: ^ task is: TASK: Verify network state restored to default 18699 1726882369.79369: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882369.79371: getting variables 18699 1726882369.79372: in VariableManager get_vars() 18699 1726882369.79381: Calling all_inventory to load vars for managed_node1 18699 1726882369.79383: Calling groups_inventory to load vars for managed_node1 18699 1726882369.79385: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882369.79390: Calling all_plugins_play to load vars for managed_node1 18699 1726882369.79397: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882369.79401: Calling groups_plugins_play to load vars for managed_node1 18699 1726882369.80609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.82216: done with get_vars() 18699 1726882369.82239: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 21:32:49 -0400 (0:00:00.979) 0:00:43.419 ****** 18699 1726882369.82318: entering _queue_task() for managed_node1/include_tasks 18699 1726882369.82678: worker is 1 (out of 1 available) 18699 1726882369.82690: exiting _queue_task() for managed_node1/include_tasks 18699 1726882369.82711: done queuing things up, now waiting for results queue to drain 18699 1726882369.82713: waiting for pending results... 
18699 1726882369.82967: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 18699 1726882369.83077: in run() - task 12673a56-9f93-1ce6-d207-000000000078 18699 1726882369.83100: variable 'ansible_search_path' from source: unknown 18699 1726882369.83138: calling self._execute() 18699 1726882369.83237: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882369.83302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882369.83305: variable 'omit' from source: magic vars 18699 1726882369.83633: variable 'ansible_distribution_major_version' from source: facts 18699 1726882369.83649: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882369.83658: _execute() done 18699 1726882369.83665: dumping result to json 18699 1726882369.83673: done dumping result, returning 18699 1726882369.83682: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [12673a56-9f93-1ce6-d207-000000000078] 18699 1726882369.83690: sending task result for task 12673a56-9f93-1ce6-d207-000000000078 18699 1726882369.83838: no more pending results, returning what we have 18699 1726882369.83842: in VariableManager get_vars() 18699 1726882369.83875: Calling all_inventory to load vars for managed_node1 18699 1726882369.83877: Calling groups_inventory to load vars for managed_node1 18699 1726882369.83880: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882369.83897: Calling all_plugins_play to load vars for managed_node1 18699 1726882369.83900: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882369.83903: Calling groups_plugins_play to load vars for managed_node1 18699 1726882369.84510: done sending task result for task 12673a56-9f93-1ce6-d207-000000000078 18699 1726882369.84514: WORKER PROCESS EXITING 18699 1726882369.85402: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.87070: done with get_vars() 18699 1726882369.87092: variable 'ansible_search_path' from source: unknown 18699 1726882369.87109: we have included files to process 18699 1726882369.87110: generating all_blocks data 18699 1726882369.87112: done generating all_blocks data 18699 1726882369.87113: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18699 1726882369.87114: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18699 1726882369.87116: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18699 1726882369.87526: done processing included file 18699 1726882369.87528: iterating over new_blocks loaded from include file 18699 1726882369.87530: in VariableManager get_vars() 18699 1726882369.87542: done with get_vars() 18699 1726882369.87544: filtering new block on tags 18699 1726882369.87561: done filtering new block on tags 18699 1726882369.87564: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 18699 1726882369.87569: extending task lists for all hosts with included blocks 18699 1726882369.87600: done extending task lists 18699 1726882369.87601: done processing included files 18699 1726882369.87602: results queue empty 18699 1726882369.87603: checking for any_errors_fatal 18699 1726882369.87604: done checking for any_errors_fatal 18699 1726882369.87605: checking for max_fail_percentage 18699 1726882369.87606: done checking for max_fail_percentage 18699 1726882369.87607: checking to see if all hosts have failed and the running 
result is not ok 18699 1726882369.87608: done checking to see if all hosts have failed 18699 1726882369.87608: getting the remaining hosts for this loop 18699 1726882369.87609: done getting the remaining hosts for this loop 18699 1726882369.87612: getting the next task for host managed_node1 18699 1726882369.87616: done getting next task for host managed_node1 18699 1726882369.87618: ^ task is: TASK: Check routes and DNS 18699 1726882369.87621: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882369.87623: getting variables 18699 1726882369.87624: in VariableManager get_vars() 18699 1726882369.87632: Calling all_inventory to load vars for managed_node1 18699 1726882369.87635: Calling groups_inventory to load vars for managed_node1 18699 1726882369.87637: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882369.87642: Calling all_plugins_play to load vars for managed_node1 18699 1726882369.87645: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882369.87648: Calling groups_plugins_play to load vars for managed_node1 18699 1726882369.88858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882369.90481: done with get_vars() 18699 1726882369.90504: done getting variables 18699 1726882369.90542: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:32:49 -0400 (0:00:00.082) 0:00:43.501 ****** 18699 1726882369.90569: entering _queue_task() for managed_node1/shell 18699 1726882369.90920: worker is 1 (out of 1 available) 18699 1726882369.90933: exiting _queue_task() for managed_node1/shell 18699 1726882369.90943: done queuing things up, now waiting for results queue to drain 18699 1726882369.90943: waiting for pending results... 
18699 1726882369.91160: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 18699 1726882369.91268: in run() - task 12673a56-9f93-1ce6-d207-00000000057e 18699 1726882369.91294: variable 'ansible_search_path' from source: unknown 18699 1726882369.91303: variable 'ansible_search_path' from source: unknown 18699 1726882369.91340: calling self._execute() 18699 1726882369.91438: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882369.91449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882369.91461: variable 'omit' from source: magic vars 18699 1726882369.91926: variable 'ansible_distribution_major_version' from source: facts 18699 1726882369.91929: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882369.91931: variable 'omit' from source: magic vars 18699 1726882369.91934: variable 'omit' from source: magic vars 18699 1726882369.91937: variable 'omit' from source: magic vars 18699 1726882369.91980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882369.92022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882369.92053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882369.92075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882369.92091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882369.92126: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882369.92134: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882369.92147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882369.92247: 
Set connection var ansible_connection to ssh 18699 1726882369.92264: Set connection var ansible_pipelining to False 18699 1726882369.92273: Set connection var ansible_shell_executable to /bin/sh 18699 1726882369.92282: Set connection var ansible_timeout to 10 18699 1726882369.92290: Set connection var ansible_shell_type to sh 18699 1726882369.92301: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882369.92332: variable 'ansible_shell_executable' from source: unknown 18699 1726882369.92362: variable 'ansible_connection' from source: unknown 18699 1726882369.92365: variable 'ansible_module_compression' from source: unknown 18699 1726882369.92367: variable 'ansible_shell_type' from source: unknown 18699 1726882369.92369: variable 'ansible_shell_executable' from source: unknown 18699 1726882369.92371: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882369.92373: variable 'ansible_pipelining' from source: unknown 18699 1726882369.92375: variable 'ansible_timeout' from source: unknown 18699 1726882369.92377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882369.92580: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882369.92584: variable 'omit' from source: magic vars 18699 1726882369.92586: starting attempt loop 18699 1726882369.92588: running the handler 18699 1726882369.92591: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882369.92594: 
_low_level_execute_command(): starting 18699 1726882369.92597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882369.93291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882369.93313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882369.93328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882369.93412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882369.93454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882369.93470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882369.93495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882369.93569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882369.95220: stdout chunk (state=3): >>>/root <<< 18699 1726882369.95357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882369.95361: stdout chunk (state=3): >>><<< 18699 1726882369.95364: stderr chunk (state=3): >>><<< 18699 1726882369.95385: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882369.95411: _low_level_execute_command(): starting 18699 1726882369.95422: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198 `" && echo ansible-tmp-1726882369.9539728-20758-233508552241198="` echo /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198 `" ) && sleep 0' 18699 1726882369.96615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882369.96619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882369.96732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882369.96755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882369.96829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882369.98714: stdout chunk (state=3): >>>ansible-tmp-1726882369.9539728-20758-233508552241198=/root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198 <<< 18699 1726882369.98827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882369.99099: stdout chunk (state=3): >>><<< 18699 1726882369.99105: stderr chunk (state=3): >>><<< 18699 1726882369.99108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882369.9539728-20758-233508552241198=/root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882369.99111: variable 'ansible_module_compression' from source: unknown 18699 1726882369.99199: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882369.99202: variable 'ansible_facts' from source: unknown 18699 1726882369.99351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py 18699 1726882369.99794: Sending initial data 18699 1726882369.99799: Sent initial data (156 bytes) 18699 1726882370.00909: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882370.00969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.00988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.01060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.02582: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882370.02639: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882370.02698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpnzuc51az /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py <<< 18699 1726882370.02708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py" <<< 18699 1726882370.02770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmpnzuc51az" to remote "/root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py" <<< 18699 1726882370.04058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.04072: stdout chunk (state=3): >>><<< 18699 1726882370.04083: stderr chunk (state=3): >>><<< 18699 1726882370.04123: done transferring module to remote 18699 1726882370.04149: _low_level_execute_command(): starting 18699 1726882370.04183: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/ /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py && sleep 0' 18699 1726882370.05382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.05611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.05694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.07479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.07482: stdout chunk (state=3): >>><<< 18699 1726882370.07492: stderr chunk (state=3): >>><<< 18699 1726882370.07522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882370.07528: _low_level_execute_command(): starting 18699 1726882370.07532: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/AnsiballZ_command.py && sleep 0' 18699 1726882370.08562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882370.08565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882370.08806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.08810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882370.08812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.08858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.08990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.24818: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2816sec preferred_lft 2816sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:32:50.236939", "end": "2024-09-20 21:32:50.245182", "delta": "0:00:00.008243", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882370.26020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 18699 1726882370.26039: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 18699 1726882370.26097: stderr chunk (state=3): >>><<< 18699 1726882370.26108: stdout chunk (state=3): >>><<< 18699 1726882370.26153: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2816sec preferred_lft 2816sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:32:50.236939", "end": "2024-09-20 21:32:50.245182", "delta": "0:00:00.008243", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": 
true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
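The `RESOLV` section of the result above is a raw dump of `/etc/resolv.conf` on the managed node. A small sketch of pulling just the nameserver addresses out of that kind of output — the sample text is copied from the task result, and the `awk` filter is an illustration, not part of the test playbook:

```shell
#!/bin/sh
# Extract nameserver addresses from resolv.conf-style text.
# The sample mirrors the RESOLV block in the task result above.
resolv='# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1'
# Keep only the second field of lines whose first word is "nameserver".
nameservers=$(printf '%s\n' "$resolv" | awk '$1 == "nameserver" { print $2 }')
printf '%s\n' "$nameservers"
```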
18699 1726882370.26355: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18699 1726882370.26362: _low_level_execute_command(): starting 18699 1726882370.26365: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882369.9539728-20758-233508552241198/ > /dev/null 2>&1 && sleep 0' 18699 1726882370.27567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18699 1726882370.27580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.27669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.27828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882370.27831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.27885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.27960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.29901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.29905: stdout chunk (state=3): >>><<< 18699 1726882370.29907: stderr chunk (state=3): >>><<< 18699 1726882370.29910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882370.29912: handler run complete 18699 1726882370.29914: Evaluated conditional (False): False 18699 1726882370.29916: attempt loop complete, returning result 18699 1726882370.29918: _execute() done 18699 1726882370.29920: dumping result to json 18699 1726882370.29922: done dumping result, returning 18699 1726882370.29924: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [12673a56-9f93-1ce6-d207-00000000057e] 18699 1726882370.29926: sending task result for task 12673a56-9f93-1ce6-d207-00000000057e ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008243", "end": "2024-09-20 21:32:50.245182", "rc": 0, "start": "2024-09-20 21:32:50.236939" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2816sec preferred_lft 2816sec inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto 
kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 18699 1726882370.30069: no more pending results, returning what we have 18699 1726882370.30073: results queue empty 18699 1726882370.30074: checking for any_errors_fatal 18699 1726882370.30076: done checking for any_errors_fatal 18699 1726882370.30077: checking for max_fail_percentage 18699 1726882370.30078: done checking for max_fail_percentage 18699 1726882370.30080: checking to see if all hosts have failed and the running result is not ok 18699 1726882370.30080: done checking to see if all hosts have failed 18699 1726882370.30081: getting the remaining hosts for this loop 18699 1726882370.30082: done getting the remaining hosts for this loop 18699 1726882370.30086: getting the next task for host managed_node1 18699 1726882370.30098: done getting next task for host managed_node1 18699 1726882370.30101: ^ task is: TASK: Verify DNS and network connectivity 18699 1726882370.30104: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18699 1726882370.30108: getting variables 18699 1726882370.30110: in VariableManager get_vars() 18699 1726882370.30141: Calling all_inventory to load vars for managed_node1 18699 1726882370.30144: Calling groups_inventory to load vars for managed_node1 18699 1726882370.30148: Calling all_plugins_inventory to load vars for managed_node1 18699 1726882370.30159: Calling all_plugins_play to load vars for managed_node1 18699 1726882370.30163: Calling groups_plugins_inventory to load vars for managed_node1 18699 1726882370.30167: Calling groups_plugins_play to load vars for managed_node1 18699 1726882370.31182: done sending task result for task 12673a56-9f93-1ce6-d207-00000000057e 18699 1726882370.31185: WORKER PROCESS EXITING 18699 1726882370.33356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18699 1726882370.36973: done with get_vars() 18699 1726882370.37200: done getting variables 18699 1726882370.37261: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:32:50 -0400 (0:00:00.467) 0:00:43.969 ****** 18699 1726882370.37302: entering _queue_task() for managed_node1/shell 18699 1726882370.38039: worker is 1 (out of 1 available) 18699 1726882370.38050: exiting _queue_task() for managed_node1/shell 18699 1726882370.38060: done queuing things up, now waiting for results queue to drain 18699 1726882370.38061: waiting for pending results... 
18699 1726882370.38310: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 18699 1726882370.38602: in run() - task 12673a56-9f93-1ce6-d207-00000000057f 18699 1726882370.38626: variable 'ansible_search_path' from source: unknown 18699 1726882370.38639: variable 'ansible_search_path' from source: unknown 18699 1726882370.38678: calling self._execute() 18699 1726882370.38967: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882370.38980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882370.38998: variable 'omit' from source: magic vars 18699 1726882370.39728: variable 'ansible_distribution_major_version' from source: facts 18699 1726882370.39811: Evaluated conditional (ansible_distribution_major_version != '6'): True 18699 1726882370.40072: variable 'ansible_facts' from source: unknown 18699 1726882370.41680: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 18699 1726882370.41769: variable 'omit' from source: magic vars 18699 1726882370.42021: variable 'omit' from source: magic vars 18699 1726882370.42024: variable 'omit' from source: magic vars 18699 1726882370.42027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18699 1726882370.42399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18699 1726882370.42402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18699 1726882370.42405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882370.42407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18699 1726882370.42409: variable 'inventory_hostname' from source: host vars for 'managed_node1' 18699 1726882370.42412: variable 
'ansible_host' from source: host vars for 'managed_node1' 18699 1726882370.42415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882370.42417: Set connection var ansible_connection to ssh 18699 1726882370.42612: Set connection var ansible_pipelining to False 18699 1726882370.42626: Set connection var ansible_shell_executable to /bin/sh 18699 1726882370.42637: Set connection var ansible_timeout to 10 18699 1726882370.42644: Set connection var ansible_shell_type to sh 18699 1726882370.42654: Set connection var ansible_module_compression to ZIP_DEFLATED 18699 1726882370.42688: variable 'ansible_shell_executable' from source: unknown 18699 1726882370.42757: variable 'ansible_connection' from source: unknown 18699 1726882370.42766: variable 'ansible_module_compression' from source: unknown 18699 1726882370.42773: variable 'ansible_shell_type' from source: unknown 18699 1726882370.42779: variable 'ansible_shell_executable' from source: unknown 18699 1726882370.42786: variable 'ansible_host' from source: host vars for 'managed_node1' 18699 1726882370.42799: variable 'ansible_pipelining' from source: unknown 18699 1726882370.42808: variable 'ansible_timeout' from source: unknown 18699 1726882370.42832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 18699 1726882370.43066: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882370.43086: variable 'omit' from source: magic vars 18699 1726882370.43102: starting attempt loop 18699 1726882370.43112: running the handler 18699 1726882370.43127: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18699 1726882370.43159: _low_level_execute_command(): starting 18699 1726882370.43172: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18699 1726882370.44016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.44060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882370.44078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.44155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.44219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.45788: stdout chunk (state=3): >>>/root <<< 18699 1726882370.46101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.46104: stdout chunk (state=3): >>><<< 18699 1726882370.46106: stderr chunk (state=3): 
>>><<< 18699 1726882370.46109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882370.46370: _low_level_execute_command(): starting 18699 1726882370.46374: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856 `" && echo ansible-tmp-1726882370.4608278-20782-193600596368856="` echo /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856 `" ) && sleep 0' 18699 1726882370.47526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.47612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.47636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.49505: stdout chunk (state=3): >>>ansible-tmp-1726882370.4608278-20782-193600596368856=/root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856 <<< 18699 1726882370.49606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.49644: stderr chunk (state=3): >>><<< 18699 1726882370.49653: stdout chunk (state=3): >>><<< 18699 1726882370.49710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882370.4608278-20782-193600596368856=/root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882370.50101: variable 'ansible_module_compression' from source: unknown 18699 1726882370.50105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-18699f6i6z5dg/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18699 1726882370.50107: variable 'ansible_facts' from source: unknown 18699 1726882370.50229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py 18699 1726882370.50503: Sending initial data 18699 1726882370.50513: Sent initial data (156 bytes) 18699 1726882370.51807: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.51916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.51943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.52085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.53582: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18699 1726882370.53641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18699 1726882370.53709: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp784se0ss /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py <<< 18699 1726882370.53717: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py" <<< 18699 1726882370.53782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-18699f6i6z5dg/tmp784se0ss" to remote "/root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py" <<< 18699 1726882370.55122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.55178: stderr chunk (state=3): >>><<< 18699 1726882370.55279: stdout chunk (state=3): >>><<< 18699 1726882370.55282: done transferring module to remote 18699 1726882370.55284: _low_level_execute_command(): starting 18699 1726882370.55288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/ /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py && sleep 0' 18699 1726882370.56634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18699 1726882370.56896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882370.56912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.56938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.57015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18699 1726882370.58862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18699 1726882370.58906: stderr chunk (state=3): >>><<< 18699 1726882370.58916: stdout chunk (state=3): >>><<< 18699 1726882370.58943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18699 1726882370.58961: _low_level_execute_command(): starting 18699 1726882370.58973: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/AnsiballZ_command.py && sleep 0' 18699 1726882370.59735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18699 1726882370.59812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18699 1726882370.59873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 18699 1726882370.59898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18699 1726882370.59934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18699 1726882370.60014: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 18699 1726882370.83397: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13589 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7447 0 --:--:-- --:--:-- --:--:-- 7657", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:32:50.749020", "end": "2024-09-20 21:32:50.832737", "delta": "0:00:00.083717", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18699 1726882370.84842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 18699 1726882370.84867: stderr chunk (state=3): >>><<< 18699 1726882370.84871: stdout chunk (state=3): >>><<< 18699 1726882370.84890: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13589 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 7447 0 --:--:-- --:--:-- --:--:-- 7657", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:32:50.749020", "end": "2024-09-20 21:32:50.832737", "delta": "0:00:00.083717", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 18699 1726882370.84926: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
18699 1726882370.84932: _low_level_execute_command(): starting
18699 1726882370.84937: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882370.4608278-20782-193600596368856/ > /dev/null 2>&1 && sleep 0'
18699 1726882370.85366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18699 1726882370.85369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882370.85372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
18699 1726882370.85374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
18699 1726882370.85376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
18699 1726882370.85434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<<
18699 1726882370.85436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
18699 1726882370.85473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
18699 1726882370.87240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
18699 1726882370.87265: stderr chunk (state=3): >>><<<
18699 1726882370.87268: stdout chunk (state=3): >>><<<
18699 1726882370.87282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
18699 1726882370.87288: handler run complete
18699 1726882370.87309: Evaluated conditional (False): False
18699 1726882370.87317: attempt loop complete, returning result
18699 1726882370.87320: _execute() done
18699 1726882370.87322: dumping result to json
18699 1726882370.87327: done dumping result, returning
18699 1726882370.87334: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [12673a56-9f93-1ce6-d207-00000000057f]
18699 1726882370.87337: sending task result for task 12673a56-9f93-1ce6-d207-00000000057f
18699 1726882370.87437: done sending task result for task 12673a56-9f93-1ce6-d207-00000000057f
18699 1726882370.87439: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.083717",
    "end": "2024-09-20 21:32:50.832737",
    "rc": 0,
    "start": "2024-09-20 21:32:50.749020"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0  13589      0 --:--:-- --:--:-- --:--:-- 13863
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   7447      0 --:--:-- --:--:-- --:--:--  7657

18699 1726882370.87523: no more pending results, returning what we have
18699 1726882370.87527: results queue empty
18699 1726882370.87528:
checking for any_errors_fatal
18699 1726882370.87536: done checking for any_errors_fatal
18699 1726882370.87536: checking for max_fail_percentage
18699 1726882370.87538: done checking for max_fail_percentage
18699 1726882370.87539: checking to see if all hosts have failed and the running result is not ok
18699 1726882370.87540: done checking to see if all hosts have failed
18699 1726882370.87540: getting the remaining hosts for this loop
18699 1726882370.87542: done getting the remaining hosts for this loop
18699 1726882370.87550: getting the next task for host managed_node1
18699 1726882370.87557: done getting next task for host managed_node1
18699 1726882370.87561: ^ task is: TASK: meta (flush_handlers)
18699 1726882370.87562: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882370.87566: getting variables
18699 1726882370.87568: in VariableManager get_vars()
18699 1726882370.87598: Calling all_inventory to load vars for managed_node1
18699 1726882370.87601: Calling groups_inventory to load vars for managed_node1
18699 1726882370.87604: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882370.87615: Calling all_plugins_play to load vars for managed_node1
18699 1726882370.87617: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882370.87621: Calling groups_plugins_play to load vars for managed_node1
18699 1726882370.88430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882370.89312: done with get_vars()
18699 1726882370.89331: done getting variables
18699 1726882370.89377: in VariableManager get_vars()
18699 1726882370.89383: Calling all_inventory to load vars for managed_node1
18699 1726882370.89384: Calling groups_inventory to load vars for managed_node1
18699 1726882370.89386: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882370.89389: Calling all_plugins_play to load vars for managed_node1
18699 1726882370.89390: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882370.89392: Calling groups_plugins_play to load vars for managed_node1
18699 1726882370.90121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882370.90989: done with get_vars()
18699 1726882370.91012: done queuing things up, now waiting for results queue to drain
18699 1726882370.91014: results queue empty
18699 1726882370.91014: checking for any_errors_fatal
18699 1726882370.91016: done checking for any_errors_fatal
18699 1726882370.91017: checking for max_fail_percentage
18699 1726882370.91017: done checking for max_fail_percentage
18699 1726882370.91018: checking to see if all hosts have failed and the running result is not ok
18699 1726882370.91018: done checking to see if all hosts have failed
18699 1726882370.91019: getting the remaining hosts for this loop
18699 1726882370.91019: done getting the remaining hosts for this loop
18699 1726882370.91021: getting the next task for host managed_node1
18699 1726882370.91024: done getting next task for host managed_node1
18699 1726882370.91025: ^ task is: TASK: meta (flush_handlers)
18699 1726882370.91026: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882370.91028: getting variables
18699 1726882370.91028: in VariableManager get_vars()
18699 1726882370.91033: Calling all_inventory to load vars for managed_node1
18699 1726882370.91035: Calling groups_inventory to load vars for managed_node1
18699 1726882370.91036: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882370.91040: Calling all_plugins_play to load vars for managed_node1
18699 1726882370.91041: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882370.91043: Calling groups_plugins_play to load vars for managed_node1
18699 1726882370.91690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882370.92542: done with get_vars()
18699 1726882370.92555: done getting variables
18699 1726882370.92591: in VariableManager get_vars()
18699 1726882370.92600: Calling all_inventory to load vars for managed_node1
18699 1726882370.92601: Calling groups_inventory to load vars for managed_node1
18699 1726882370.92603: Calling all_plugins_inventory to load vars for managed_node1
18699 1726882370.92606: Calling all_plugins_play to load vars for managed_node1
18699 1726882370.92607: Calling groups_plugins_inventory to load vars for managed_node1
18699 1726882370.92609: Calling groups_plugins_play to load vars for managed_node1
18699 1726882370.93292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18699 1726882370.94153: done with get_vars()
18699 1726882370.94169: done queuing things up, now waiting for results queue to drain
18699 1726882370.94171: results queue empty
18699 1726882370.94171: checking for any_errors_fatal
18699 1726882370.94172: done checking for any_errors_fatal
18699 1726882370.94173: checking for max_fail_percentage
18699 1726882370.94173: done checking for max_fail_percentage
18699 1726882370.94174: checking to see if all hosts have failed and the running result is not ok
18699 1726882370.94174: done checking to see if all hosts have failed
18699 1726882370.94175: getting the remaining hosts for this loop
18699 1726882370.94175: done getting the remaining hosts for this loop
18699 1726882370.94177: getting the next task for host managed_node1
18699 1726882370.94179: done getting next task for host managed_node1
18699 1726882370.94180: ^ task is: None
18699 1726882370.94181: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18699 1726882370.94181: done queuing things up, now waiting for results queue to drain
18699 1726882370.94182: results queue empty
18699 1726882370.94182: checking for any_errors_fatal
18699 1726882370.94183: done checking for any_errors_fatal
18699 1726882370.94183: checking for max_fail_percentage
18699 1726882370.94184: done checking for max_fail_percentage
18699 1726882370.94184: checking to see if all hosts have failed and the running result is not ok
18699 1726882370.94185: done checking to see if all hosts have failed
18699 1726882370.94185: getting the next task for host managed_node1
18699 1726882370.94187: done getting next task for host managed_node1
18699 1726882370.94187: ^ task is: None
18699 1726882370.94188: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
PLAY RECAP *********************************************************************
managed_node1              : ok=82   changed=3    unreachable=0    failed=0    skipped=74   rescued=0    ignored=1

Friday 20 September 2024  21:32:50 -0400 (0:00:00.569)       0:00:44.538 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.72s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.51s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.27s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.27s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.20s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Create veth interface lsr27 --------------------------------------------- 1.18s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.12s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
fedora.linux_system_roles.network : Check which packages are installed --- 0.91s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install iproute --------------------------------------------------------- 0.84s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 0.84s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
18699 1726882370.94271: RUNNING CLEANUP
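To run the "Verify DNS and network connectivity" check by hand outside Ansible, the one-liner captured in the task result's "cmd" field can be sketched as a standalone script. This is a readability-oriented reconstruction, not the canonical test source: `check_host` and `check_all` are helper names introduced here (the original inlines everything in one loop), while the `getent`/`curl` calls, echo wording, and host list match the log.

```shell
#!/usr/bin/env bash
# Sketch of the connectivity check from the task result above.
set -euo pipefail

check_host() {
    local host=$1
    # getent resolves through NSS, so this verifies DNS as the managed
    # host actually sees it (the STDOUT section shows its output).
    if ! getent hosts "$host"; then
        echo "FAILED to lookup host $host"
        return 1
    fi
    # Plain curl (as in the log) writes its progress meter to stderr,
    # which is why the STDERR section shows transfer tables.
    if ! curl -o /dev/null "https://$host"; then
        echo "FAILED to contact host $host"
        return 1
    fi
}

check_all() {
    echo "CHECK DNS AND CONNECTIVITY"
    local host
    for host in "$@"; do
        check_host "$host" || return 1
    done
}
```

Calling `check_all mirrors.fedoraproject.org mirrors.centos.org` mirrors the task's command, which finished with rc=0 here.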